Mar 18 12:09:39 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 12:09:39 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:09:39 crc restorecon[4751]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc 
restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc 
restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:39 crc restorecon[4751]: 
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc 
restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc 
restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:09:39 crc restorecon[4751]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc 
restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:09:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 
crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc 
restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:09:40 crc restorecon[4751]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc 
restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:09:40 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 12:09:40 crc kubenswrapper[4921]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:09:40 crc kubenswrapper[4921]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 12:09:40 crc kubenswrapper[4921]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:09:40 crc kubenswrapper[4921]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 12:09:40 crc kubenswrapper[4921]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 12:09:40 crc kubenswrapper[4921]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.923881 4921 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928913 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928935 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928939 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928944 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928948 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928952 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928956 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928962 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928966 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928969 4921 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928974 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928978 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928983 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928989 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.928996 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929003 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929019 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929025 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929032 4921 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929037 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929041 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929045 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929050 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929054 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929061 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929067 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929072 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929077 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929081 4921 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929087 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929090 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929094 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929098 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929102 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929124 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929128 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929131 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929135 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929140 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929145 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929149 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929153 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929157 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929161 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929165 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929169 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929172 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929176 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929180 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929185 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929191 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929195 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929199 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929202 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929206 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929211 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929214 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929218 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929221 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929225 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929228 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929232 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929235 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929239 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929242 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929246 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929249 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929253 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929257 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929261 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.929264 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929359 4921 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929370 4921 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929377 4921 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929383 4921 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929389 4921 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929394 4921 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929400 4921 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929406 4921 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929410 4921 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929414 4921 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929420 4921 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929425 4921 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929429 4921 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929433 4921 flags.go:64] FLAG: --cgroup-root=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929437 4921 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929441 4921 flags.go:64] FLAG: --client-ca-file=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929445 4921 flags.go:64] FLAG: --cloud-config=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929449 4921 flags.go:64] FLAG: --cloud-provider=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929453 4921 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929460 4921 flags.go:64] FLAG: --cluster-domain=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929465 4921 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929469 4921 flags.go:64] FLAG: --config-dir=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929473 4921 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929477 4921 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929484 4921 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929488 4921 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929492 4921 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929497 4921 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929500 4921 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929506 4921 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929510 4921 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929514 4921 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929519 4921 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929525 4921 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929529 4921 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929533 4921 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929537 4921 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929542 4921 flags.go:64] FLAG: --enable-server="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929546 4921 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929551 4921 flags.go:64] FLAG: --event-burst="100"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929555 4921 flags.go:64] FLAG: --event-qps="50"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929559 4921 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929563 4921 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929569 4921 flags.go:64] FLAG: --eviction-hard=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929575 4921 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929579 4921 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929584 4921 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929589 4921 flags.go:64] FLAG: --eviction-soft=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929593 4921 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929598 4921 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929602 4921 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929606 4921 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929610 4921 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929614 4921 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929618 4921 flags.go:64] FLAG: --feature-gates=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929624 4921 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929627 4921 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929632 4921 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929637 4921 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929641 4921 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929646 4921 flags.go:64] FLAG: --help="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929650 4921 flags.go:64] FLAG: --hostname-override=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929655 4921 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929659 4921 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929663 4921 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929668 4921 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929671 4921 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929675 4921 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929680 4921 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929684 4921 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929688 4921 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929692 4921 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929696 4921 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929700 4921 flags.go:64] FLAG: --kube-reserved=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929705 4921 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929710 4921 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929714 4921 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929718 4921 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929723 4921 flags.go:64] FLAG: --lock-file=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929727 4921 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929731 4921 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929735 4921 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929742 4921 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929746 4921 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929750 4921 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929754 4921 flags.go:64] FLAG: --logging-format="text"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929759 4921 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929763 4921 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929767 4921 flags.go:64] FLAG: --manifest-url=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929771 4921 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929778 4921 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929782 4921 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929788 4921 flags.go:64] FLAG: --max-pods="110"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929792 4921 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929797 4921 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929802 4921 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929809 4921 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929819 4921 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929824 4921 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929830 4921 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929843 4921 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929848 4921 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929853 4921 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929858 4921 flags.go:64] FLAG: --pod-cidr=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929865 4921 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929875 4921 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929880 4921 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929885 4921 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929889 4921 flags.go:64] FLAG: --port="10250"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929893 4921 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929897 4921 flags.go:64] FLAG: --provider-id=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929901 4921 flags.go:64] FLAG: --qos-reserved=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929905 4921 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929909 4921 flags.go:64] FLAG: --register-node="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929913 4921 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929917 4921 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929925 4921 flags.go:64] FLAG: --registry-burst="10"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929929 4921 flags.go:64] FLAG: --registry-qps="5"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929937 4921 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929941 4921 flags.go:64] FLAG: --reserved-memory=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929947 4921 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929951 4921 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929956 4921 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929960 4921 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929965 4921 flags.go:64] FLAG: --runonce="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929970 4921 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929974 4921 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929979 4921 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929985 4921 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929990 4921 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.929999 4921 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930009 4921 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930016 4921 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930022 4921 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930027 4921 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930031 4921 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930035 4921 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930040 4921 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930044 4921 flags.go:64] FLAG: --system-cgroups=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930048 4921 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930057 4921 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930062 4921 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930066 4921 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930085 4921 flags.go:64] FLAG: --tls-min-version=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930089 4921 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930093 4921 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930097 4921 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930101 4921 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930123 4921 flags.go:64] FLAG: --v="2"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930130 4921 flags.go:64] FLAG: --version="false"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930139 4921 flags.go:64] FLAG: --vmodule=""
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930145 4921 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.930150 4921 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930259 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930264 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930268 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930272 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930276 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930280 4921 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930284 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930287 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930293 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930297 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930301 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930305 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930308 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930312 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930315 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930320 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930324 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930327 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930331 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930334 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930338 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930341 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930345 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930349 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930353 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930356 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930360 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930363 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930366 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930372 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930375 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930378 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930382 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930386 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930389 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930393 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930397 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930400 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930404 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930407 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930411 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930414 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930420 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930452 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930456 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930461 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930465 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930468 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930471 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930475 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930479 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930482 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930486 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930491 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930496 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930500 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930505 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930510 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930515 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930520 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930524 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930531 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930534 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930538 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930541 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930545 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930548 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930552 4921 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930559 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930563 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.930566 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.931356 4921 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.945853 4921 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.945912 4921 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946101 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946161 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946174 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946186 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946199 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946209 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946219 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946229 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946239 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946249 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946260 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946270 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946280 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946290 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946304 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946314 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946324 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946333 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946343 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946353 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946363 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946373 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946383 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:09:40
crc kubenswrapper[4921]: W0318 12:09:40.946393 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946403 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946413 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946423 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946433 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946443 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946492 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946504 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946518 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946537 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946549 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946560 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946571 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946583 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946594 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946603 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946613 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946623 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946633 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946644 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946654 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946663 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946675 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946685 4921 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946694 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946705 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946715 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946730 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946740 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946750 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946761 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946771 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946781 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946792 4921 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946803 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946813 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946823 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946833 4921 feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 
12:09:40.946843 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946853 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946866 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946881 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946892 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946906 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946919 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946933 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946946 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.946958 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.946976 4921 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947421 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947445 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947458 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947470 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947482 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947493 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947503 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947514 4921 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947524 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947535 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947545 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947555 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947564 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947575 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947588 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947598 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947608 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947619 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947630 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947641 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947652 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947662 4921 feature_gate.go:330] unrecognized 
feature gate: AlibabaPlatform Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947672 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947682 4921 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947693 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947703 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947714 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947725 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947735 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947745 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947755 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947765 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947775 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947786 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947796 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947810 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947824 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947838 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947849 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947860 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947871 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947882 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947893 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947903 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947913 4921 feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947923 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947934 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947944 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947953 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947963 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 12:09:40 crc kubenswrapper[4921]: 
W0318 12:09:40.947978 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947988 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.947998 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948008 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948018 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948028 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948040 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948051 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948060 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948074 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948086 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948098 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948166 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948179 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948189 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948200 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948215 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948227 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948238 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948249 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 12:09:40 crc kubenswrapper[4921]: W0318 12:09:40.948261 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.948277 4921 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.948635 4921 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 12:09:40 crc kubenswrapper[4921]: E0318 12:09:40.954793 4921 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.961138 4921 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.961304 4921 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.963195 4921 server.go:997] "Starting client certificate rotation" Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.963235 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.963440 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.990822 4921 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:09:40 crc kubenswrapper[4921]: E0318 12:09:40.994526 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:40 crc kubenswrapper[4921]: I0318 12:09:40.997478 4921 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.013414 4921 log.go:25] "Validated CRI v1 runtime API" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.052508 4921 log.go:25] "Validated CRI v1 image API" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.055242 4921 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.061858 4921 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-12-04-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.061914 4921 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.096929 4921 manager.go:217] Machine: {Timestamp:2026-03-18 12:09:41.092037442 +0000 UTC m=+0.641958171 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0e096851-48c9-4cbf-9c0d-a42cb1e79e38 BootID:5a791cc0-dddf-47b6-8995-f4a6b294e6ba Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:88:58:6b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:58:6b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:78:3d:b8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2c:27:b9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:77:3f:b0 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:69:b8:58 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:15:83:fd Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:fe:be:36:ae:9a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:95:58:3b:fe:64 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.097414 4921 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.097685 4921 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.100289 4921 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.100690 4921 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.100762 4921 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.101798 4921 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.101835 4921 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.102579 4921 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.102626 4921 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.103006 4921 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.103295 4921 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.110280 4921 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.110330 4921 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.110392 4921 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.110425 4921 kubelet.go:324] "Adding apiserver pod source"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.110453 4921 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.113617 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.113670 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.113761 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError"
Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.113758 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.118873 4921 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.119922 4921 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.121770 4921 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127617 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127655 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127666 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127677 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127694 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127707 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127718 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127734 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127748 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127761 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127778 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127788 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.127820 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.128435 4921 server.go:1280] "Started kubelet"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.130209 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Mar 18 12:09:41 crc systemd[1]: Started Kubernetes Kubelet.
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.130891 4921 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.130890 4921 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.132230 4921 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.133678 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.133743 4921 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.134008 4921 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.134128 4921 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.134305 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.134410 4921 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.134972 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.135098 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.135613 4921 server.go:460] "Adding debug handlers to kubelet server"
Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.135962 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="200ms"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.136791 4921 factory.go:153] Registering CRI-O factory
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.136905 4921 factory.go:221] Registration of the crio container factory successfully
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.137065 4921 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.137170 4921 factory.go:55] Registering systemd factory
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.137240 4921 factory.go:221] Registration of the systemd container factory successfully
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.137318 4921 factory.go:103] Registering Raw factory
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.137407 4921 manager.go:1196] Started watching for new ooms in manager
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.138451 4921 manager.go:319] Starting recovery of all containers
Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.145894 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.200:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189dee3ea8091e46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,LastTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164027 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164195 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164219 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164241 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164263 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164285 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164312 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164335 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164362 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164386 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164410 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164431 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164453 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164482 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164505 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164526 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164584 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164609 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164631 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164653 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164675 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164697 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164717 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164740 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164762 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164789 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164835 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164891 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164927 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.164979 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165001 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165021 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165043 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165065 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165151 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165175 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.165197 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.166649 4921 manager.go:324] Recovery completed
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168385 4921 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168443 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168470 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168494 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168520 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168544 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168565 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168591 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168618 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168641 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168662 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168684 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168709 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168730 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168753 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168775 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168809 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168877 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168919 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168945 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168971 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.168998 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169032 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169057 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169080 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169102 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169171 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169195 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169219 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169241 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169264 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169290 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169313 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169335 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169356 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169378 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169399 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169421 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169443 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169467 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169491 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169513 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169535 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169576 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169598 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169621 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c"
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169645 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169667 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169690 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169714 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169737 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169759 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169780 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169802 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169824 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169848 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169873 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169894 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169917 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169942 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169966 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.169988 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170011 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170034 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 12:09:41 crc 
kubenswrapper[4921]: I0318 12:09:41.170057 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170085 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170134 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170167 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170225 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170250 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170275 4921 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170299 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170324 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170348 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170372 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170394 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170415 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170437 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170459 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170481 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170501 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170521 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170540 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170563 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170586 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170608 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170630 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170653 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170677 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170701 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170722 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170744 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170766 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170789 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170811 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170836 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170889 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170913 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170934 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170955 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.170978 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" 
seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171001 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171023 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171046 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171070 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171091 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171151 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171175 4921 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171197 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171220 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171241 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171264 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171289 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171309 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171329 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171349 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171370 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171389 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171409 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171431 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171453 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171473 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171506 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171526 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171547 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171568 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171587 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171607 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171627 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171649 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171669 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171690 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171711 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171731 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171754 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171774 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171795 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171816 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171837 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171857 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171876 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171896 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171917 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171939 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" 
seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171963 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.171983 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172003 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172023 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172043 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172063 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 12:09:41 crc 
kubenswrapper[4921]: I0318 12:09:41.172090 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172177 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172221 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172246 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172273 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172297 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172319 4921 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172340 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172361 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172398 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172447 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172477 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172519 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172557 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172587 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172607 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172631 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172652 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172679 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172703 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172725 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172745 4921 reconstruct.go:97] "Volume reconstruction finished" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.172760 4921 reconciler.go:26] "Reconciler: start to sync state" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.187941 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.192309 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.192375 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.192389 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.194507 4921 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.194554 4921 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.194596 
4921 state_mem.go:36] "Initialized new in-memory state store" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.204025 4921 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.206877 4921 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.207199 4921 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.207705 4921 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.207938 4921 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.215414 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.215481 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.222929 4921 policy_none.go:49] "None policy: Start" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.224214 4921 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.224282 4921 state_mem.go:35] "Initializing new in-memory state store" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.234462 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.287347 4921 manager.go:334] "Starting Device Plugin manager" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.287540 4921 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.287631 4921 server.go:79] "Starting device plugin registration server" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.288339 4921 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.288446 4921 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.289043 4921 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.289306 4921 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.289331 4921 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.295810 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.308263 4921 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.308530 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.309959 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.310020 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.310043 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.310399 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.310771 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.310846 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.311829 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.311870 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.311885 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.312259 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.312401 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.312458 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.312809 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.313029 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.313137 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314208 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314266 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314289 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314542 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314627 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314691 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314710 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314727 
4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.314813 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.316729 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.316756 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.316769 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.317079 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.317243 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.317303 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.317333 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.317373 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.317380 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.318790 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.318822 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.318884 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.319016 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.319052 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.319063 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.319341 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.319381 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.320471 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.320499 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.320510 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.336724 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="400ms" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375217 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375354 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375418 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375550 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375734 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375811 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375891 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375922 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.375955 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.376011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.376074 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.376133 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.376169 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.389306 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.390988 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.391037 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.391050 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.391086 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.391827 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477551 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477631 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477773 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477811 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 
crc kubenswrapper[4921]: I0318 12:09:41.477859 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477883 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477933 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477956 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.477978 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478022 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478091 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478140 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478162 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478184 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 
12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478437 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478529 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478546 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478580 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478584 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478664 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478751 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478757 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478810 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478818 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 
12:09:41.478909 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.478991 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.479033 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.592577 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.594359 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.594437 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.594450 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.594489 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.595196 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial 
tcp 38.129.56.200:6443: connect: connection refused" node="crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.655251 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.679582 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.687626 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.704449 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.709041 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.738881 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="800ms" Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.762012 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-cbabf9d3ff3b6394d4dd346ee756e57d69dab8e6bec5941f919da8a5daa1d469 WatchSource:0}: Error finding container cbabf9d3ff3b6394d4dd346ee756e57d69dab8e6bec5941f919da8a5daa1d469: Status 404 returned error can't find the container with id cbabf9d3ff3b6394d4dd346ee756e57d69dab8e6bec5941f919da8a5daa1d469 Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.763418 4921 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f519370a08a53dbe19c4118230dd84c28790db584cfaff9364728fb0e3276d8d WatchSource:0}: Error finding container f519370a08a53dbe19c4118230dd84c28790db584cfaff9364728fb0e3276d8d: Status 404 returned error can't find the container with id f519370a08a53dbe19c4118230dd84c28790db584cfaff9364728fb0e3276d8d Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.766183 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-cf20f4975d322bf6e0a28d42728c02dc8c414d1c42e4c43efcdc48488c2b1100 WatchSource:0}: Error finding container cf20f4975d322bf6e0a28d42728c02dc8c414d1c42e4c43efcdc48488c2b1100: Status 404 returned error can't find the container with id cf20f4975d322bf6e0a28d42728c02dc8c414d1c42e4c43efcdc48488c2b1100 Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.768958 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a349cad7c52fdbb5e116970c22bc90147f1dc1864938e8aa70411b845e375ce0 WatchSource:0}: Error finding container a349cad7c52fdbb5e116970c22bc90147f1dc1864938e8aa70411b845e375ce0: Status 404 returned error can't find the container with id a349cad7c52fdbb5e116970c22bc90147f1dc1864938e8aa70411b845e375ce0 Mar 18 12:09:41 crc kubenswrapper[4921]: W0318 12:09:41.773275 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f919bdd074e241ba3d42908c3a46888f8b014352cb899573cba6db2f8a27927b WatchSource:0}: Error finding container f919bdd074e241ba3d42908c3a46888f8b014352cb899573cba6db2f8a27927b: Status 404 returned error can't find the container with id 
f919bdd074e241ba3d42908c3a46888f8b014352cb899573cba6db2f8a27927b Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.996415 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.998070 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.998140 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.998158 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:41 crc kubenswrapper[4921]: I0318 12:09:41.998200 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:41 crc kubenswrapper[4921]: E0318 12:09:41.998696 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.131952 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:42 crc kubenswrapper[4921]: W0318 12:09:42.212749 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:42 crc kubenswrapper[4921]: E0318 12:09:42.212966 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.212993 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cbabf9d3ff3b6394d4dd346ee756e57d69dab8e6bec5941f919da8a5daa1d469"} Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.214257 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f919bdd074e241ba3d42908c3a46888f8b014352cb899573cba6db2f8a27927b"} Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.215623 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a349cad7c52fdbb5e116970c22bc90147f1dc1864938e8aa70411b845e375ce0"} Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.216733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cf20f4975d322bf6e0a28d42728c02dc8c414d1c42e4c43efcdc48488c2b1100"} Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.218791 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f519370a08a53dbe19c4118230dd84c28790db584cfaff9364728fb0e3276d8d"} Mar 18 12:09:42 crc kubenswrapper[4921]: W0318 12:09:42.383085 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:42 crc kubenswrapper[4921]: E0318 12:09:42.383209 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:42 crc kubenswrapper[4921]: E0318 12:09:42.540691 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="1.6s" Mar 18 12:09:42 crc kubenswrapper[4921]: W0318 12:09:42.699151 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:42 crc kubenswrapper[4921]: E0318 12:09:42.699294 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:42 crc kubenswrapper[4921]: W0318 12:09:42.796482 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:42 crc 
kubenswrapper[4921]: E0318 12:09:42.796571 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.798891 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.801562 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.801661 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.801687 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:42 crc kubenswrapper[4921]: I0318 12:09:42.801735 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:42 crc kubenswrapper[4921]: E0318 12:09:42.802511 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.072415 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:43 crc kubenswrapper[4921]: E0318 12:09:43.073946 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.131517 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.225967 4921 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="377df8906e29239947eeda9a46c1eb59e453240c92156c917f279686e19ab9ff" exitCode=0 Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.226131 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.226093 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"377df8906e29239947eeda9a46c1eb59e453240c92156c917f279686e19ab9ff"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.227451 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.227500 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.227515 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.229770 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451" exitCode=0 Mar 18 
12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.229868 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.230017 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.231522 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.231582 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.231599 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.233891 4921 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="be661ea2baf0b55b5e5e5a6d47cc8b87e4c327f161667cd0aefd0abae06aec37" exitCode=0 Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.233983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"be661ea2baf0b55b5e5e5a6d47cc8b87e4c327f161667cd0aefd0abae06aec37"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.234041 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.234788 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.235871 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.235938 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.235965 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.236341 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.236397 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.236419 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.237323 4921 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ff0759d89b392ce640bbc52813f7180578ecb602cbf4665752eeba3825ae7e0b" exitCode=0 Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.237430 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.237464 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ff0759d89b392ce640bbc52813f7180578ecb602cbf4665752eeba3825ae7e0b"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.238917 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.238961 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 
12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.238978 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.242567 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b5d42030aefc4b40d77063c2570cb4c9596965758c43c7db089b4fca3cb1f6f9"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.242622 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62b43cb52b340ab9a2342be17cc62328ab31f90756f4bc80a72d912e88ddc5c5"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.242651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce01f847123390b2e865adfe6bee10890f7e15ca2d059d5cffa0ca04991e5d3f"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.242675 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"acadff4bcb285c38774780df7e1790b5ff60121600c87cf90f3e2a464c849d26"} Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.242724 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.245546 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.245602 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:09:43 crc kubenswrapper[4921]: I0318 12:09:43.245615 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:43 crc kubenswrapper[4921]: E0318 12:09:43.375913 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.200:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189dee3ea8091e46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,LastTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.131322 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:44 crc kubenswrapper[4921]: E0318 12:09:44.141743 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="3.2s" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.248880 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8a8c3ebd5011452d83758267a3c06ce61d25aa54aca1835e414b7fd3ba18c0ee"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.248901 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.250443 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.250475 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.250488 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.253087 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"67e68e6b66881d538e8445b9462ce1e7e16262cacc6f26ce0fe652bc04e8198d"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.253136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6aafadcd082125854f61cfd3fdc360e7c3f4ea8a27c4b7856ffca263d039de2"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.253153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d950700c19c67f8cbc369a2dd2b3a7bee617a83034e6000a7615c6b0f002ee45"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.253190 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:44 crc 
kubenswrapper[4921]: I0318 12:09:44.254924 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.254959 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.254972 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:44 crc kubenswrapper[4921]: W0318 12:09:44.262749 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:44 crc kubenswrapper[4921]: E0318 12:09:44.262830 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.263153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.263203 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.263220 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.263231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.265022 4921 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ab028b2b394c150d86d5fe2fd0805244ceb83a1ab6cc3e2f4f28b26ae56761f3" exitCode=0 Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.265174 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.265319 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.265460 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ab028b2b394c150d86d5fe2fd0805244ceb83a1ab6cc3e2f4f28b26ae56761f3"} Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.267312 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.267344 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.267373 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.268144 4921 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.268181 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.268193 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.402829 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.404235 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.404274 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.404287 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:44 crc kubenswrapper[4921]: I0318 12:09:44.404318 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:44 crc kubenswrapper[4921]: E0318 12:09:44.404793 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Mar 18 12:09:44 crc kubenswrapper[4921]: W0318 12:09:44.479573 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:44 crc kubenswrapper[4921]: E0318 12:09:44.479658 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:44 crc kubenswrapper[4921]: W0318 12:09:44.807285 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Mar 18 12:09:44 crc kubenswrapper[4921]: E0318 12:09:44.807385 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.257595 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.273278 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03c91fe64e872ca53795e1b1a802bb60d75f2b207e072b88e8aae6bce41ee52b"} Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.273517 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.274816 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.274868 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.274891 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.281240 4921 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1ad29a81be936628330b9179fa5103474a6cfccacc3500cff8afa4baad68dec2" exitCode=0 Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.281450 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.282436 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.283102 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1ad29a81be936628330b9179fa5103474a6cfccacc3500cff8afa4baad68dec2"} Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.283265 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.284006 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.284063 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.285648 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.285700 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.285723 4921 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.286897 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.286941 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.286962 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.287905 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.287950 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.287969 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.288316 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.288336 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.288346 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:45 crc kubenswrapper[4921]: I0318 12:09:45.839631 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.290854 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31eb691c3e1c8c0527745e818792498d59d18c34d1d2b58e8336613f7e53163d"} Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.290919 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"32f25014ce0a7257da0a0c384450b1c4ed0c35fd291d49cbd461d8e78755a15b"} Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.290944 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.290947 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e72099b077cb1d70d069aa0b2e49fc0a433fef3aab7dbbfd4c9bc83a90b29d3"} Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.291002 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.292467 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.292500 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.292508 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.644313 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.644776 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.646698 
4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.646932 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.647144 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.666288 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:46 crc kubenswrapper[4921]: I0318 12:09:46.674305 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.299222 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"88f19cd12b0a6a087f91f364fee4edb0307928c09d78a7ac85f77824af18a8fd"} Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.299293 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"42841e89282dbfba40ffa9bad76d3b7a57b83641b0777daf0473a4dd3db89588"} Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.299337 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.299337 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.299538 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.300328 4921 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.300900 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.300969 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.300997 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.301588 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.301643 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.301661 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.301858 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.301924 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.301949 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.428002 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.605704 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 
12:09:47.607546 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.607611 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.607631 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:47 crc kubenswrapper[4921]: I0318 12:09:47.607671 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.301850 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.301896 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.301904 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.303156 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.303193 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.303205 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.303301 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.303352 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 
12:09:48.303363 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.341209 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.341434 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.343041 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.343102 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.343151 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:48 crc kubenswrapper[4921]: I0318 12:09:48.996005 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.304631 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.306054 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.306097 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.306124 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.572868 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.573162 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.574634 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.574679 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:49 crc kubenswrapper[4921]: I0318 12:09:49.574692 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.498256 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.498555 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.500345 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.500381 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.500394 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.703164 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.703543 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:50 crc kubenswrapper[4921]: 
I0318 12:09:50.705945 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.705989 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:50 crc kubenswrapper[4921]: I0318 12:09:50.706002 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:51 crc kubenswrapper[4921]: E0318 12:09:51.295949 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:09:51 crc kubenswrapper[4921]: I0318 12:09:51.341980 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:09:51 crc kubenswrapper[4921]: I0318 12:09:51.342141 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:09:52 crc kubenswrapper[4921]: I0318 12:09:52.881398 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 12:09:52 crc kubenswrapper[4921]: I0318 12:09:52.881792 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:52 crc kubenswrapper[4921]: I0318 12:09:52.883703 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 
12:09:52 crc kubenswrapper[4921]: I0318 12:09:52.883783 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:52 crc kubenswrapper[4921]: I0318 12:09:52.883796 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.131530 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.266079 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.266348 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.268239 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.268300 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.268326 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:55 crc kubenswrapper[4921]: W0318 12:09:55.300840 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.300982 4921 trace.go:236] Trace[1983125702]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 12:09:45.298) 
(total time: 10002ms): Mar 18 12:09:55 crc kubenswrapper[4921]: Trace[1983125702]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:09:55.300) Mar 18 12:09:55 crc kubenswrapper[4921]: Trace[1983125702]: [10.002017977s] [10.002017977s] END Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.301024 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 12:09:55 crc kubenswrapper[4921]: W0318 12:09:55.604921 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.605040 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:55 crc kubenswrapper[4921]: W0318 12:09:55.606507 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.606574 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:55 crc kubenswrapper[4921]: W0318 12:09:55.607564 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.607602 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.608588 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is 
after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.609162 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee3ea8091e46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,LastTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.610756 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 12:09:55 crc kubenswrapper[4921]: E0318 12:09:55.617930 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.627566 4921 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe 
status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.627692 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.632437 4921 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.632526 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.845199 4921 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]log ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]etcd ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 12:09:55 crc kubenswrapper[4921]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-apiextensions-informers ok Mar 18 12:09:55 crc kubenswrapper[4921]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 18 12:09:55 crc kubenswrapper[4921]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 12:09:55 crc kubenswrapper[4921]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 12:09:55 crc kubenswrapper[4921]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 
12:09:55 crc kubenswrapper[4921]: [+]poststarthook/bootstrap-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-registration-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]autoregister-completion ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 12:09:55 crc kubenswrapper[4921]: livez check failed Mar 18 12:09:55 crc kubenswrapper[4921]: I0318 12:09:55.845282 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.134994 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:56Z is after 2026-02-23T05:33:13Z Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.328769 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.331250 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="03c91fe64e872ca53795e1b1a802bb60d75f2b207e072b88e8aae6bce41ee52b" exitCode=255 Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.331315 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"03c91fe64e872ca53795e1b1a802bb60d75f2b207e072b88e8aae6bce41ee52b"} Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.331515 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.332791 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.333310 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.333353 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:56 crc kubenswrapper[4921]: I0318 12:09:56.334387 4921 scope.go:117] "RemoveContainer" containerID="03c91fe64e872ca53795e1b1a802bb60d75f2b207e072b88e8aae6bce41ee52b" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.134638 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:57Z is after 2026-02-23T05:33:13Z Mar 18 12:09:57 crc kubenswrapper[4921]: 
I0318 12:09:57.336432 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.337288 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.339162 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" exitCode=255 Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.339171 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165"} Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.339316 4921 scope.go:117] "RemoveContainer" containerID="03c91fe64e872ca53795e1b1a802bb60d75f2b207e072b88e8aae6bce41ee52b" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.339556 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.341071 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.341102 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.341199 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:57 crc kubenswrapper[4921]: I0318 12:09:57.341643 4921 scope.go:117] "RemoveContainer" 
containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:09:57 crc kubenswrapper[4921]: E0318 12:09:57.341813 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:09:58 crc kubenswrapper[4921]: I0318 12:09:58.136719 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:58Z is after 2026-02-23T05:33:13Z Mar 18 12:09:58 crc kubenswrapper[4921]: I0318 12:09:58.345015 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.135885 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:09:59Z is after 2026-02-23T05:33:13Z Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.573928 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.574232 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.575841 4921 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.575918 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.575941 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:09:59 crc kubenswrapper[4921]: I0318 12:09:59.576869 4921 scope.go:117] "RemoveContainer" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:09:59 crc kubenswrapper[4921]: E0318 12:09:59.577202 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.136750 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:00Z is after 2026-02-23T05:33:13Z Mar 18 12:10:00 crc kubenswrapper[4921]: W0318 12:10:00.646589 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:00Z is after 2026-02-23T05:33:13Z Mar 18 12:10:00 crc kubenswrapper[4921]: E0318 12:10:00.646736 4921 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.848896 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.849452 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.855580 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.855637 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.855658 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.856623 4921 scope.go:117] "RemoveContainer" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:10:00 crc kubenswrapper[4921]: E0318 12:10:00.856936 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:00 crc kubenswrapper[4921]: I0318 12:10:00.859980 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.135849 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:01Z is after 2026-02-23T05:33:13Z Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.356458 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.357517 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.357549 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.357561 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.358230 4921 scope.go:117] "RemoveContainer" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:10:01 crc kubenswrapper[4921]: E0318 12:10:01.358464 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:01 crc kubenswrapper[4921]: E0318 12:10:01.943870 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to 
get node info: node \"crc\" not found" Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.945161 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:01 crc kubenswrapper[4921]: I0318 12:10:01.945460 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:02 crc kubenswrapper[4921]: E0318 12:10:02.015503 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.018642 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.020427 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.020505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.020522 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 
12:10:02.020554 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:02 crc kubenswrapper[4921]: E0318 12:10:02.025527 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.136555 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.185509 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.359150 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.360364 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.360439 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.360467 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.361578 4921 scope.go:117] "RemoveContainer" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:10:02 crc kubenswrapper[4921]: E0318 12:10:02.361955 4921 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:02 crc kubenswrapper[4921]: W0318 12:10:02.622875 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z Mar 18 12:10:02 crc kubenswrapper[4921]: E0318 12:10:02.623039 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:02 crc kubenswrapper[4921]: W0318 12:10:02.702222 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z Mar 18 12:10:02 crc kubenswrapper[4921]: E0318 12:10:02.702345 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.914881 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.915160 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.916367 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.916412 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.916428 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:02 crc kubenswrapper[4921]: I0318 12:10:02.931554 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 12:10:03 crc kubenswrapper[4921]: I0318 12:10:03.136660 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:03Z is after 2026-02-23T05:33:13Z Mar 18 12:10:03 crc kubenswrapper[4921]: I0318 12:10:03.362254 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:03 crc kubenswrapper[4921]: I0318 12:10:03.363509 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 12:10:03 crc kubenswrapper[4921]: I0318 12:10:03.363588 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:03 crc kubenswrapper[4921]: I0318 12:10:03.363608 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:03 crc kubenswrapper[4921]: I0318 12:10:03.755726 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:03 crc kubenswrapper[4921]: E0318 12:10:03.759689 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:04 crc kubenswrapper[4921]: I0318 12:10:04.136412 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:04Z is after 2026-02-23T05:33:13Z Mar 18 12:10:05 crc kubenswrapper[4921]: I0318 12:10:05.135521 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:05Z is after 2026-02-23T05:33:13Z Mar 18 12:10:05 crc kubenswrapper[4921]: E0318 12:10:05.615098 4921 event.go:368] "Unable to write event (may retry 
after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee3ea8091e46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,LastTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:06 crc kubenswrapper[4921]: I0318 12:10:06.133960 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:06Z is after 2026-02-23T05:33:13Z Mar 18 12:10:07 crc kubenswrapper[4921]: W0318 12:10:07.090026 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:07Z is after 2026-02-23T05:33:13Z Mar 18 12:10:07 crc kubenswrapper[4921]: E0318 12:10:07.090205 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:07 crc kubenswrapper[4921]: I0318 12:10:07.135654 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:07Z is after 2026-02-23T05:33:13Z Mar 18 12:10:07 crc kubenswrapper[4921]: W0318 12:10:07.861907 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:07Z is after 2026-02-23T05:33:13Z Mar 18 12:10:07 crc kubenswrapper[4921]: E0318 12:10:07.861993 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:08 crc kubenswrapper[4921]: I0318 12:10:08.137275 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:08Z is after 2026-02-23T05:33:13Z Mar 18 12:10:09 crc kubenswrapper[4921]: E0318 12:10:09.022301 4921 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:09Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:10:09 crc kubenswrapper[4921]: I0318 12:10:09.026646 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:09 crc kubenswrapper[4921]: I0318 12:10:09.028567 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:09 crc kubenswrapper[4921]: I0318 12:10:09.028627 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:09 crc kubenswrapper[4921]: I0318 12:10:09.028652 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:09 crc kubenswrapper[4921]: I0318 12:10:09.028699 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:09 crc kubenswrapper[4921]: E0318 12:10:09.032768 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:09Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:09 crc kubenswrapper[4921]: I0318 12:10:09.135594 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:09Z is after 2026-02-23T05:33:13Z Mar 18 12:10:10 crc kubenswrapper[4921]: I0318 12:10:10.136992 4921 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:10Z is after 2026-02-23T05:33:13Z Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.137073 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:11Z is after 2026-02-23T05:33:13Z Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.342026 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.342153 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.342244 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.342463 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.344367 4921 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.344434 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.344455 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.345532 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ce01f847123390b2e865adfe6bee10890f7e15ca2d059d5cffa0ca04991e5d3f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 12:10:11 crc kubenswrapper[4921]: I0318 12:10:11.345822 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ce01f847123390b2e865adfe6bee10890f7e15ca2d059d5cffa0ca04991e5d3f" gracePeriod=30 Mar 18 12:10:11 crc kubenswrapper[4921]: E0318 12:10:11.946105 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.136955 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:12Z is after 2026-02-23T05:33:13Z Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.391669 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.392161 4921 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ce01f847123390b2e865adfe6bee10890f7e15ca2d059d5cffa0ca04991e5d3f" exitCode=255 Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.392224 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ce01f847123390b2e865adfe6bee10890f7e15ca2d059d5cffa0ca04991e5d3f"} Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.392266 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d84f8f73f18b8f21caf6e9ca83d5198191a69b3aec6eb330151937af4864a31"} Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.392393 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.393592 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.393644 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:12 crc kubenswrapper[4921]: I0318 12:10:12.393665 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:13 crc kubenswrapper[4921]: I0318 12:10:13.134469 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:13Z is after 2026-02-23T05:33:13Z Mar 18 12:10:14 crc kubenswrapper[4921]: I0318 12:10:14.135926 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:14Z is after 2026-02-23T05:33:13Z Mar 18 12:10:15 crc kubenswrapper[4921]: I0318 12:10:15.136749 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:15Z is after 2026-02-23T05:33:13Z Mar 18 12:10:15 crc kubenswrapper[4921]: I0318 12:10:15.209238 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4921]: I0318 12:10:15.211981 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4921]: I0318 12:10:15.212278 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4921]: I0318 12:10:15.212479 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4921]: I0318 12:10:15.213797 4921 scope.go:117] "RemoveContainer" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:10:15 crc kubenswrapper[4921]: E0318 12:10:15.621418 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee3ea8091e46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,LastTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:16 crc kubenswrapper[4921]: E0318 12:10:16.027152 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:16Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.033350 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.034926 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.034991 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.035016 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.035058 4921 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:16 crc kubenswrapper[4921]: E0318 12:10:16.038634 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:16Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.136270 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:16Z is after 2026-02-23T05:33:13Z Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.405024 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.407552 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b"} Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.407872 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.409494 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.409537 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:16 crc kubenswrapper[4921]: I0318 12:10:16.409549 4921 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:16 crc kubenswrapper[4921]: W0318 12:10:16.433514 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:16Z is after 2026-02-23T05:33:13Z Mar 18 12:10:16 crc kubenswrapper[4921]: E0318 12:10:16.433601 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.136397 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:17Z is after 2026-02-23T05:33:13Z Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.412725 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.414433 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.417242 4921 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b" exitCode=255 Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.417316 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b"} Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.417397 4921 scope.go:117] "RemoveContainer" containerID="a97b200a23d6b8fb3ea454fe9c1ed6f3eba460f42035db3765a00db4d1fcb165" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.417582 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.419166 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.419232 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.419258 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4921]: I0318 12:10:17.420308 4921 scope.go:117] "RemoveContainer" containerID="c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b" Mar 18 12:10:17 crc kubenswrapper[4921]: E0318 12:10:17.420726 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:18 crc 
kubenswrapper[4921]: I0318 12:10:18.134889 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:18Z is after 2026-02-23T05:33:13Z Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.341866 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.342184 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.344300 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.344384 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.344415 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.422976 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.996085 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.996343 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.997897 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.997942 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4921]: I0318 12:10:18.997954 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.134878 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:19Z is after 2026-02-23T05:33:13Z Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.572873 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.573153 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.574602 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.574687 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.574708 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4921]: I0318 12:10:19.575841 4921 scope.go:117] "RemoveContainer" containerID="c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b" Mar 18 12:10:19 crc kubenswrapper[4921]: E0318 12:10:19.576208 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:20 crc kubenswrapper[4921]: I0318 12:10:20.140933 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:20Z is after 2026-02-23T05:33:13Z Mar 18 12:10:20 crc kubenswrapper[4921]: I0318 12:10:20.271046 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:20 crc kubenswrapper[4921]: E0318 12:10:20.280532 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:20 crc kubenswrapper[4921]: E0318 12:10:20.281775 4921 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 18 12:10:21 crc kubenswrapper[4921]: I0318 12:10:21.135766 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:10:21Z is after 2026-02-23T05:33:13Z Mar 18 12:10:21 crc kubenswrapper[4921]: W0318 12:10:21.262441 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:21Z is after 2026-02-23T05:33:13Z Mar 18 12:10:21 crc kubenswrapper[4921]: E0318 12:10:21.262574 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:21 crc kubenswrapper[4921]: W0318 12:10:21.309348 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:21Z is after 2026-02-23T05:33:13Z Mar 18 12:10:21 crc kubenswrapper[4921]: E0318 12:10:21.309526 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:21 crc kubenswrapper[4921]: I0318 12:10:21.342726 4921 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:21 crc kubenswrapper[4921]: I0318 12:10:21.342804 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:21 crc kubenswrapper[4921]: E0318 12:10:21.946757 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.134318 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:22Z is after 2026-02-23T05:33:13Z Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.184483 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.185968 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.187572 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.187613 4921 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.187625 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:22 crc kubenswrapper[4921]: I0318 12:10:22.188283 4921 scope.go:117] "RemoveContainer" containerID="c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b" Mar 18 12:10:22 crc kubenswrapper[4921]: E0318 12:10:22.189024 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:23 crc kubenswrapper[4921]: E0318 12:10:23.037011 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:23Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:10:23 crc kubenswrapper[4921]: I0318 12:10:23.038999 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:23 crc kubenswrapper[4921]: I0318 12:10:23.040675 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:23 crc kubenswrapper[4921]: I0318 12:10:23.040729 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:23 crc kubenswrapper[4921]: I0318 12:10:23.040746 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
18 12:10:23 crc kubenswrapper[4921]: I0318 12:10:23.040780 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:23 crc kubenswrapper[4921]: E0318 12:10:23.046348 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:23Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:23 crc kubenswrapper[4921]: I0318 12:10:23.136171 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:23Z is after 2026-02-23T05:33:13Z Mar 18 12:10:24 crc kubenswrapper[4921]: I0318 12:10:24.138506 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:25 crc kubenswrapper[4921]: I0318 12:10:25.136837 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.628320 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3ea8091e46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,LastTimestamp:2026-03-18 12:09:41.128396358 +0000 UTC m=+0.678317007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.635162 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.641320 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.652035 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.658546 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eb1b35d63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.290548579 +0000 UTC m=+0.840469258,LastTimestamp:2026-03-18 12:09:41.290548579 +0000 UTC 
m=+0.840469258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.663983 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.309998343 +0000 UTC m=+0.859919022,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.669096 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.310034734 +0000 UTC m=+0.859955413,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.674290 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd9b206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.310055025 +0000 UTC m=+0.859975704,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.679412 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.311858669 +0000 UTC m=+0.861779318,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.683872 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.311880079 +0000 UTC m=+0.861800728,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.688338 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd9b206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.311891579 +0000 UTC m=+0.861812228,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.692833 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.312849983 +0000 UTC m=+0.862770632,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.697541 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.313091809 +0000 UTC m=+0.863012468,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.702239 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd9b206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.31315599 +0000 UTC m=+0.863076639,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.707057 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.314242997 +0000 UTC m=+0.864163676,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.712311 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.314281108 +0000 UTC m=+0.864201787,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.716460 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd9b206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.314298918 +0000 UTC m=+0.864219597,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.720156 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC 
m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.314664297 +0000 UTC m=+0.864584976,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.724686 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.314703708 +0000 UTC m=+0.864624367,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.729316 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd9b206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.314721098 +0000 UTC m=+0.864641757,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.732872 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.316748648 +0000 UTC m=+0.866669297,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.737041 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.316764728 +0000 UTC m=+0.866685377,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.743286 4921 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd9b206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd9b206 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192397318 +0000 UTC m=+0.742317967,LastTimestamp:2026-03-18 12:09:41.316777488 +0000 UTC m=+0.866698147,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.750260 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd90e77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd90e77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192355447 +0000 UTC m=+0.742276106,LastTimestamp:2026-03-18 12:09:41.317278691 +0000 UTC m=+0.867199370,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.755086 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee3eabd98203\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee3eabd98203 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.192385027 +0000 UTC m=+0.742305686,LastTimestamp:2026-03-18 12:09:41.317356743 +0000 UTC m=+0.867277422,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.762354 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ece894b36 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.774330678 +0000 UTC m=+1.324251327,LastTimestamp:2026-03-18 12:09:41.774330678 +0000 UTC m=+1.324251327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.767715 4921 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3ece8cac59 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.774552153 +0000 UTC m=+1.324472832,LastTimestamp:2026-03-18 12:09:41.774552153 +0000 UTC m=+1.324472832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.772567 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3ece9ef958 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.775751512 +0000 UTC m=+1.325672191,LastTimestamp:2026-03-18 12:09:41.775751512 +0000 UTC 
m=+1.325672191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.776927 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3ece9fdaaa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.775809194 +0000 UTC m=+1.325729873,LastTimestamp:2026-03-18 12:09:41.775809194 +0000 UTC m=+1.325729873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.780695 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3ecec6bfc8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:41.778358216 +0000 UTC m=+1.328278865,LastTimestamp:2026-03-18 12:09:41.778358216 +0000 UTC m=+1.328278865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.785060 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ef3091703 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.386685699 +0000 UTC m=+1.936606338,LastTimestamp:2026-03-18 12:09:42.386685699 +0000 UTC m=+1.936606338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.789768 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3ef314044e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.387401806 +0000 UTC m=+1.937322445,LastTimestamp:2026-03-18 12:09:42.387401806 +0000 UTC m=+1.937322445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.793605 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3ef32090a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.388224166 +0000 UTC m=+1.938144805,LastTimestamp:2026-03-18 12:09:42.388224166 +0000 UTC m=+1.938144805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.797497 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3ef3f2b708 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.401996552 +0000 UTC m=+1.951917191,LastTimestamp:2026-03-18 12:09:42.401996552 +0000 UTC m=+1.951917191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.801718 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ef5d64f46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.433689414 +0000 UTC m=+1.983610053,LastTimestamp:2026-03-18 12:09:42.433689414 +0000 UTC m=+1.983610053,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.805536 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ef5ef31be openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.435320254 +0000 UTC m=+1.985240893,LastTimestamp:2026-03-18 12:09:42.435320254 +0000 UTC m=+1.985240893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.809340 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3ef654c684 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.441977476 +0000 UTC m=+1.991898135,LastTimestamp:2026-03-18 12:09:42.441977476 +0000 UTC m=+1.991898135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.810454 4921 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3ef6b90068 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.448545896 +0000 UTC m=+1.998466535,LastTimestamp:2026-03-18 12:09:42.448545896 +0000 UTC m=+1.998466535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.812966 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3ef6c4934b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.449304395 +0000 UTC m=+1.999225044,LastTimestamp:2026-03-18 12:09:42.449304395 +0000 UTC m=+1.999225044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.814678 4921 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3ef73b0ffc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.457069564 +0000 UTC m=+2.006990223,LastTimestamp:2026-03-18 12:09:42.457069564 +0000 UTC m=+2.006990223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.817508 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3ef8c15b3b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.482647867 +0000 UTC m=+2.032568506,LastTimestamp:2026-03-18 12:09:42.482647867 +0000 UTC m=+2.032568506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 
12:10:25.819180 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f069cf3b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.715143089 +0000 UTC m=+2.265063778,LastTimestamp:2026-03-18 12:09:42.715143089 +0000 UTC m=+2.265063778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.830528 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f076c5323 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.728733475 +0000 UTC m=+2.278654124,LastTimestamp:2026-03-18 12:09:42.728733475 +0000 UTC 
m=+2.278654124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.836252 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f07827d95 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.730186133 +0000 UTC m=+2.280106782,LastTimestamp:2026-03-18 12:09:42.730186133 +0000 UTC m=+2.280106782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.840747 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f1452a660 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.945154656 +0000 UTC m=+2.495075335,LastTimestamp:2026-03-18 12:09:42.945154656 +0000 UTC m=+2.495075335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.844783 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f152e5121 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.959550753 +0000 UTC m=+2.509471422,LastTimestamp:2026-03-18 12:09:42.959550753 +0000 UTC m=+2.509471422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.848488 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f1547d3ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.961222636 +0000 UTC m=+2.511143315,LastTimestamp:2026-03-18 12:09:42.961222636 +0000 UTC m=+2.511143315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.853297 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f21e99287 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.173149319 +0000 UTC m=+2.723069958,LastTimestamp:2026-03-18 12:09:43.173149319 +0000 UTC 
m=+2.723069958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.857390 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f2292f44a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.18424993 +0000 UTC m=+2.734170579,LastTimestamp:2026-03-18 12:09:43.18424993 +0000 UTC m=+2.734170579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.863753 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f254a8e0c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.229836812 +0000 UTC m=+2.779757481,LastTimestamp:2026-03-18 12:09:43.229836812 +0000 UTC m=+2.779757481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.868966 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f259197ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.234492334 +0000 UTC m=+2.784413003,LastTimestamp:2026-03-18 12:09:43.234492334 +0000 UTC m=+2.784413003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.874869 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3f25f1c2d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.240794839 +0000 UTC m=+2.790715478,LastTimestamp:2026-03-18 12:09:43.240794839 +0000 UTC m=+2.790715478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.878932 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3f2609408d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.242334349 +0000 UTC m=+2.792254998,LastTimestamp:2026-03-18 12:09:43.242334349 +0000 UTC m=+2.792254998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.883067 4921 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f31e1e940 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.441082688 +0000 UTC m=+2.991003327,LastTimestamp:2026-03-18 12:09:43.441082688 +0000 UTC m=+2.991003327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.886853 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f32de9b12 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.457643282 +0000 UTC m=+3.007563921,LastTimestamp:2026-03-18 12:09:43.457643282 +0000 UTC m=+3.007563921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.890583 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3f32e8ea47 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.458318919 +0000 UTC m=+3.008239558,LastTimestamp:2026-03-18 12:09:43.458318919 +0000 UTC m=+3.008239558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.895006 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f32e8ff87 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.458324359 +0000 UTC m=+3.008244998,LastTimestamp:2026-03-18 12:09:43.458324359 +0000 UTC m=+3.008244998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.898947 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f32ef9d49 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.458757961 +0000 UTC m=+3.008678600,LastTimestamp:2026-03-18 12:09:43.458757961 +0000 UTC m=+3.008678600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.903913 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3f33026c73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 
12:09:43.459990643 +0000 UTC m=+3.009911282,LastTimestamp:2026-03-18 12:09:43.459990643 +0000 UTC m=+3.009911282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.908571 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f3417f881 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.478179969 +0000 UTC m=+3.028100608,LastTimestamp:2026-03-18 12:09:43.478179969 +0000 UTC m=+3.028100608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.914787 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f343359c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.479974336 +0000 UTC m=+3.029894975,LastTimestamp:2026-03-18 12:09:43.479974336 +0000 UTC m=+3.029894975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.919572 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee3f34f4849d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.492633757 +0000 UTC m=+3.042554396,LastTimestamp:2026-03-18 12:09:43.492633757 +0000 UTC m=+3.042554396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.923553 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3f35a67054 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.504293972 +0000 UTC m=+3.054214611,LastTimestamp:2026-03-18 12:09:43.504293972 +0000 UTC m=+3.054214611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.927188 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f3fef4756 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.676839766 +0000 UTC m=+3.226760405,LastTimestamp:2026-03-18 12:09:43.676839766 +0000 UTC m=+3.226760405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.932012 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f4019648e 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.679599758 +0000 UTC m=+3.229520397,LastTimestamp:2026-03-18 12:09:43.679599758 +0000 UTC m=+3.229520397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.935654 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f41057905 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.695071493 +0000 UTC m=+3.244992132,LastTimestamp:2026-03-18 12:09:43.695071493 +0000 UTC m=+3.244992132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.939662 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f411b83be openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.69651603 +0000 UTC m=+3.246436659,LastTimestamp:2026-03-18 12:09:43.69651603 +0000 UTC m=+3.246436659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.943132 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f417b525d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.702794845 +0000 UTC m=+3.252715484,LastTimestamp:2026-03-18 12:09:43.702794845 +0000 UTC m=+3.252715484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.946896 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f418af4f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.703819511 +0000 UTC m=+3.253740170,LastTimestamp:2026-03-18 12:09:43.703819511 +0000 UTC m=+3.253740170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.951159 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f4ceb70e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 
12:09:43.894692065 +0000 UTC m=+3.444612714,LastTimestamp:2026-03-18 12:09:43.894692065 +0000 UTC m=+3.444612714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.955576 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f4d03fdb1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.896300977 +0000 UTC m=+3.446221616,LastTimestamp:2026-03-18 12:09:43.896300977 +0000 UTC m=+3.446221616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.959215 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f4dfbf49e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.912551582 +0000 UTC m=+3.462472221,LastTimestamp:2026-03-18 12:09:43.912551582 +0000 UTC m=+3.462472221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.968501 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f4e1047eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.913883627 +0000 UTC m=+3.463804266,LastTimestamp:2026-03-18 12:09:43.913883627 +0000 UTC m=+3.463804266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.972723 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee3f4e1dcfa4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:43.91477034 +0000 UTC m=+3.464690979,LastTimestamp:2026-03-18 12:09:43.91477034 +0000 UTC m=+3.464690979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.976153 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f5d661ae5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.171166437 +0000 UTC m=+3.721087096,LastTimestamp:2026-03-18 12:09:44.171166437 +0000 UTC m=+3.721087096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc 
kubenswrapper[4921]: E0318 12:10:25.979735 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f5e5da63d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.187389501 +0000 UTC m=+3.737310150,LastTimestamp:2026-03-18 12:09:44.187389501 +0000 UTC m=+3.737310150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.983690 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f5e7c0dec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.189382124 +0000 UTC 
m=+3.739302793,LastTimestamp:2026-03-18 12:09:44.189382124 +0000 UTC m=+3.739302793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.988413 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3f633e131e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.269206302 +0000 UTC m=+3.819126941,LastTimestamp:2026-03-18 12:09:44.269206302 +0000 UTC m=+3.819126941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.992034 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f6ca099cb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.426658251 +0000 UTC m=+3.976578880,LastTimestamp:2026-03-18 12:09:44.426658251 +0000 UTC m=+3.976578880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.995800 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f6df53c8b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.448982155 +0000 UTC m=+3.998902794,LastTimestamp:2026-03-18 12:09:44.448982155 +0000 UTC m=+3.998902794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:25 crc kubenswrapper[4921]: E0318 12:10:25.998958 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3f6fc55136 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.47939615 +0000 UTC m=+4.029316789,LastTimestamp:2026-03-18 12:09:44.47939615 +0000 UTC m=+4.029316789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.003673 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3f70c26cd0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.495983824 +0000 UTC m=+4.045904473,LastTimestamp:2026-03-18 12:09:44.495983824 +0000 UTC m=+4.045904473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.009402 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fa0184405 openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.290138629 +0000 UTC m=+4.840059318,LastTimestamp:2026-03-18 12:09:45.290138629 +0000 UTC m=+4.840059318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.013691 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fadc49410 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.51953512 +0000 UTC m=+5.069455759,LastTimestamp:2026-03-18 12:09:45.51953512 +0000 UTC m=+5.069455759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.018810 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fae69df95 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.530367893 +0000 UTC m=+5.080288542,LastTimestamp:2026-03-18 12:09:45.530367893 +0000 UTC m=+5.080288542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.020429 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fae7c39f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.531570675 +0000 UTC m=+5.081491344,LastTimestamp:2026-03-18 12:09:45.531570675 +0000 UTC m=+5.081491344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.027220 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189dee3fbfa599fb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.819494907 +0000 UTC m=+5.369415586,LastTimestamp:2026-03-18 12:09:45.819494907 +0000 UTC m=+5.369415586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.032146 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fc3e7245b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.890899035 +0000 UTC m=+5.440819704,LastTimestamp:2026-03-18 12:09:45.890899035 +0000 UTC m=+5.440819704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.036140 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fc40ebe97 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:45.893494423 +0000 UTC m=+5.443415112,LastTimestamp:2026-03-18 12:09:45.893494423 +0000 UTC m=+5.443415112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.040017 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fd35fd315 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.150466325 +0000 UTC m=+5.700386984,LastTimestamp:2026-03-18 12:09:46.150466325 +0000 UTC m=+5.700386984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.043195 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fd7291841 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.213988417 +0000 UTC m=+5.763909066,LastTimestamp:2026-03-18 12:09:46.213988417 +0000 UTC m=+5.763909066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.047398 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fd744bcf6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.215800054 +0000 UTC m=+5.765720703,LastTimestamp:2026-03-18 12:09:46.215800054 +0000 UTC m=+5.765720703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.050853 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3fe9fd76ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.529896191 +0000 UTC m=+6.079816830,LastTimestamp:2026-03-18 12:09:46.529896191 +0000 UTC m=+6.079816830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.055616 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3feeb0ef0e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.608766734 +0000 UTC m=+6.158687393,LastTimestamp:2026-03-18 12:09:46.608766734 +0000 UTC m=+6.158687393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.059877 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3feec636bc 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.61016134 +0000 UTC m=+6.160081989,LastTimestamp:2026-03-18 12:09:46.61016134 +0000 UTC m=+6.160081989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.065551 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee3ffd5f8526 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.855089446 +0000 UTC m=+6.405010085,LastTimestamp:2026-03-18 12:09:46.855089446 +0000 UTC m=+6.405010085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.072617 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189dee3ffe759001 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:46.873311233 +0000 UTC m=+6.423231882,LastTimestamp:2026-03-18 12:09:46.873311233 +0000 UTC m=+6.423231882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.083312 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee4108d15dd3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:26 crc kubenswrapper[4921]: body: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:51.342067155 +0000 UTC m=+10.891987834,LastTimestamp:2026-03-18 12:09:51.342067155 +0000 UTC m=+10.891987834,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.087953 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee4108d369c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:51.342201288 +0000 UTC m=+10.892121967,LastTimestamp:2026-03-18 12:09:51.342201288 +0000 UTC m=+10.892121967,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.092075 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-apiserver-crc.189dee42084258bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 12:10:26 
crc kubenswrapper[4921]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:26 crc kubenswrapper[4921]: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.627661503 +0000 UTC m=+15.177582202,LastTimestamp:2026-03-18 12:09:55.627661503 +0000 UTC m=+15.177582202,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.096918 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee4208437675 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.627734645 +0000 UTC m=+15.177655324,LastTimestamp:2026-03-18 12:09:55.627734645 +0000 UTC m=+15.177655324,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.101889 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee42084258bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-apiserver-crc.189dee42084258bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 12:10:26 crc kubenswrapper[4921]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:26 crc kubenswrapper[4921]: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.627661503 +0000 UTC m=+15.177582202,LastTimestamp:2026-03-18 12:09:55.63248336 +0000 UTC m=+15.182404029,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.105591 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee4208437675\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee4208437675 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.627734645 +0000 UTC m=+15.177655324,LastTimestamp:2026-03-18 12:09:55.632557982 +0000 UTC m=+15.182478661,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.110626 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-apiserver-crc.189dee42153aae26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 18 12:10:26 crc kubenswrapper[4921]: body: [+]ping ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]log ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]etcd ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 12:10:26 crc kubenswrapper[4921]: 
[+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-apiextensions-informers ok Mar 18 12:10:26 crc kubenswrapper[4921]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 18 12:10:26 crc kubenswrapper[4921]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 12:10:26 crc kubenswrapper[4921]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 12:10:26 crc kubenswrapper[4921]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/bootstrap-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 
12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-registration-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]autoregister-completion ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 12:10:26 crc kubenswrapper[4921]: livez check failed Mar 18 12:10:26 crc kubenswrapper[4921]: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.845262886 +0000 UTC m=+15.395183515,LastTimestamp:2026-03-18 12:09:55.845262886 +0000 UTC m=+15.395183515,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.114911 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee42153c0016 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:55.845349398 +0000 UTC m=+15.395270037,LastTimestamp:2026-03-18 
12:09:55.845349398 +0000 UTC m=+15.395270037,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.118948 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee3f5e7c0dec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee3f5e7c0dec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:44.189382124 +0000 UTC m=+3.739302793,LastTimestamp:2026-03-18 12:09:56.335919391 +0000 UTC m=+15.885840070,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.122742 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee4108d15dd3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee4108d15dd3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:26 crc kubenswrapper[4921]: body: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:51.342067155 +0000 UTC m=+10.891987834,LastTimestamp:2026-03-18 12:10:01.945427542 +0000 UTC m=+21.495348241,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.125076 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee4108d369c8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee4108d369c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:51.342201288 +0000 UTC m=+10.892121967,LastTimestamp:2026-03-18 12:10:01.945676378 +0000 UTC 
m=+21.495597087,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.127105 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee45b0e9a52a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:26 crc kubenswrapper[4921]: body: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:11.342099754 +0000 UTC m=+30.892020433,LastTimestamp:2026-03-18 12:10:11.342099754 +0000 UTC m=+30.892020433,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.130995 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee45b0eb2f47 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:11.342200647 +0000 UTC m=+30.892121316,LastTimestamp:2026-03-18 12:10:11.342200647 +0000 UTC m=+30.892121316,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: I0318 12:10:26.135491 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.135581 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee45b121fb75 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:11.345791861 +0000 UTC 
m=+30.895712520,LastTimestamp:2026-03-18 12:10:11.345791861 +0000 UTC m=+30.895712520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.140309 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3ef5ef31be\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3ef5ef31be openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.435320254 +0000 UTC m=+1.985240893,LastTimestamp:2026-03-18 12:10:11.466669373 +0000 UTC m=+31.016590042,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.144282 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3f069cf3b1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f069cf3b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.715143089 +0000 UTC m=+2.265063778,LastTimestamp:2026-03-18 12:10:11.655539964 +0000 UTC m=+31.205460613,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.147872 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee3f076c5323\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee3f076c5323 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:09:42.728733475 +0000 UTC m=+2.278654124,LastTimestamp:2026-03-18 12:10:11.668478642 +0000 UTC m=+31.218399321,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.153279 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee45b0e9a52a\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:26 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee45b0e9a52a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:26 crc kubenswrapper[4921]: body: Mar 18 12:10:26 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:11.342099754 +0000 UTC m=+30.892020433,LastTimestamp:2026-03-18 12:10:21.342780255 +0000 UTC m=+40.892700934,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:26 crc kubenswrapper[4921]: > Mar 18 12:10:26 crc kubenswrapper[4921]: E0318 12:10:26.157403 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee45b0eb2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee45b0eb2f47 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:11.342200647 +0000 UTC m=+30.892121316,LastTimestamp:2026-03-18 12:10:21.342840297 +0000 UTC m=+40.892760976,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:27 crc kubenswrapper[4921]: I0318 12:10:27.138540 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:28 crc kubenswrapper[4921]: I0318 12:10:28.135682 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:29 crc kubenswrapper[4921]: W0318 12:10:29.019072 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 12:10:29 crc kubenswrapper[4921]: E0318 12:10:29.019213 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:29 crc kubenswrapper[4921]: I0318 12:10:29.136111 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Mar 18 12:10:30 crc kubenswrapper[4921]: E0318 12:10:30.043737 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.046744 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.048025 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.048080 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.048101 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.048196 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:30 crc kubenswrapper[4921]: E0318 12:10:30.052755 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.138454 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.504253 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.504588 
4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.505928 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.506063 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:30 crc kubenswrapper[4921]: I0318 12:10:30.506177 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:31 crc kubenswrapper[4921]: I0318 12:10:31.136134 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:31 crc kubenswrapper[4921]: I0318 12:10:31.342850 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:31 crc kubenswrapper[4921]: I0318 12:10:31.342976 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:31 crc kubenswrapper[4921]: E0318 12:10:31.349927 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee45b0e9a52a\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:31 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee45b0e9a52a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:31 crc kubenswrapper[4921]: body: Mar 18 12:10:31 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:11.342099754 +0000 UTC m=+30.892020433,LastTimestamp:2026-03-18 12:10:31.342944395 +0000 UTC m=+50.892865074,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:31 crc kubenswrapper[4921]: > Mar 18 12:10:31 crc kubenswrapper[4921]: E0318 12:10:31.947219 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:32 crc kubenswrapper[4921]: I0318 12:10:32.137599 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:33 crc kubenswrapper[4921]: I0318 12:10:33.134869 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 12:10:34 crc kubenswrapper[4921]: I0318 12:10:34.136991 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:35 crc kubenswrapper[4921]: I0318 12:10:35.136507 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:36 crc kubenswrapper[4921]: I0318 12:10:36.138660 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:37 crc kubenswrapper[4921]: E0318 12:10:37.049967 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.052888 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.054366 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.054536 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.054638 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:37 crc kubenswrapper[4921]: 
I0318 12:10:37.054746 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:37 crc kubenswrapper[4921]: E0318 12:10:37.059835 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.140485 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.208559 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.210051 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.210162 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.210176 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.210849 4921 scope.go:117] "RemoveContainer" containerID="c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.481883 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.483815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15"} Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.483958 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.484720 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.484743 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:37 crc kubenswrapper[4921]: I0318 12:10:37.484753 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.138985 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.346198 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.346399 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.347650 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.347704 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.347721 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.350392 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.486632 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.487702 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.487768 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:38 crc kubenswrapper[4921]: I0318 12:10:38.487784 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.136025 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.491380 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.492079 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.494168 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" exitCode=255 Mar 18 12:10:39 crc kubenswrapper[4921]: 
I0318 12:10:39.494212 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15"} Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.494250 4921 scope.go:117] "RemoveContainer" containerID="c45a41ab8486074aa6e367e489dab1023e3fd359e5271cfef33e187f3809bc9b" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.494388 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.495228 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.495263 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.495276 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.495806 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:10:39 crc kubenswrapper[4921]: E0318 12:10:39.495980 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:39 crc kubenswrapper[4921]: I0318 12:10:39.573223 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:40 crc kubenswrapper[4921]: 
I0318 12:10:40.135138 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:40 crc kubenswrapper[4921]: I0318 12:10:40.509430 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:10:40 crc kubenswrapper[4921]: I0318 12:10:40.511548 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:40 crc kubenswrapper[4921]: I0318 12:10:40.513035 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:40 crc kubenswrapper[4921]: I0318 12:10:40.513086 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:40 crc kubenswrapper[4921]: I0318 12:10:40.513099 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:40 crc kubenswrapper[4921]: I0318 12:10:40.514011 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:10:40 crc kubenswrapper[4921]: E0318 12:10:40.514259 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:41 crc kubenswrapper[4921]: I0318 12:10:41.135259 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:41 crc kubenswrapper[4921]: E0318 12:10:41.947819 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.135360 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.184927 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.185141 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.186274 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.186301 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.186313 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:42 crc kubenswrapper[4921]: I0318 12:10:42.186794 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:10:42 crc kubenswrapper[4921]: E0318 12:10:42.187008 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:42 crc kubenswrapper[4921]: W0318 12:10:42.614978 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 12:10:42 crc kubenswrapper[4921]: E0318 12:10:42.615065 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:10:43 crc kubenswrapper[4921]: I0318 12:10:43.135774 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:44 crc kubenswrapper[4921]: E0318 12:10:44.058424 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:44 crc kubenswrapper[4921]: I0318 12:10:44.060620 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:44 crc kubenswrapper[4921]: I0318 12:10:44.062105 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:44 crc kubenswrapper[4921]: I0318 12:10:44.062169 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:44 crc kubenswrapper[4921]: I0318 
12:10:44.062182 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:44 crc kubenswrapper[4921]: I0318 12:10:44.062222 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:44 crc kubenswrapper[4921]: E0318 12:10:44.068909 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:44 crc kubenswrapper[4921]: I0318 12:10:44.138433 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:45 crc kubenswrapper[4921]: I0318 12:10:45.138556 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:46 crc kubenswrapper[4921]: I0318 12:10:46.132494 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:47 crc kubenswrapper[4921]: I0318 12:10:47.136405 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:48 crc kubenswrapper[4921]: I0318 12:10:48.135671 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 18 12:10:49 crc kubenswrapper[4921]: I0318 12:10:49.136708 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:50 crc kubenswrapper[4921]: I0318 12:10:50.138234 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:51 crc kubenswrapper[4921]: E0318 12:10:51.066377 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:51 crc kubenswrapper[4921]: I0318 12:10:51.069338 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:51 crc kubenswrapper[4921]: I0318 12:10:51.070784 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:51 crc kubenswrapper[4921]: I0318 12:10:51.070823 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:51 crc kubenswrapper[4921]: I0318 12:10:51.070838 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:51 crc kubenswrapper[4921]: I0318 12:10:51.070868 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:51 crc kubenswrapper[4921]: E0318 12:10:51.074884 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot 
create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:51 crc kubenswrapper[4921]: I0318 12:10:51.135795 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:51 crc kubenswrapper[4921]: E0318 12:10:51.949021 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:52 crc kubenswrapper[4921]: I0318 12:10:52.136254 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:52 crc kubenswrapper[4921]: I0318 12:10:52.283102 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:52 crc kubenswrapper[4921]: I0318 12:10:52.299845 4921 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 12:10:53 crc kubenswrapper[4921]: I0318 12:10:53.136412 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:54 crc kubenswrapper[4921]: I0318 12:10:54.138297 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:55 crc kubenswrapper[4921]: I0318 12:10:55.134975 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:55 crc kubenswrapper[4921]: I0318 12:10:55.208551 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:55 crc kubenswrapper[4921]: I0318 12:10:55.210041 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:55 crc kubenswrapper[4921]: I0318 12:10:55.210099 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:55 crc kubenswrapper[4921]: I0318 12:10:55.210173 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:55 crc kubenswrapper[4921]: I0318 12:10:55.211195 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:10:55 crc kubenswrapper[4921]: E0318 12:10:55.211483 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:56 crc kubenswrapper[4921]: I0318 12:10:56.137246 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:56 crc kubenswrapper[4921]: I0318 12:10:56.185354 4921 csr.go:261] certificate signing request csr-dbz5c is approved, waiting to be issued Mar 18 12:10:56 crc kubenswrapper[4921]: I0318 12:10:56.200826 4921 
csr.go:257] certificate signing request csr-dbz5c is issued Mar 18 12:10:56 crc kubenswrapper[4921]: I0318 12:10:56.240666 4921 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 12:10:56 crc kubenswrapper[4921]: I0318 12:10:56.963781 4921 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 12:10:57 crc kubenswrapper[4921]: I0318 12:10:57.202722 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 09:49:44.501718503 +0000 UTC Mar 18 12:10:57 crc kubenswrapper[4921]: I0318 12:10:57.202816 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5949h38m47.2989084s for next certificate rotation Mar 18 12:10:57 crc kubenswrapper[4921]: I0318 12:10:57.208148 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:57 crc kubenswrapper[4921]: I0318 12:10:57.210754 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:57 crc kubenswrapper[4921]: I0318 12:10:57.210798 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:57 crc kubenswrapper[4921]: I0318 12:10:57.210814 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.075733 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.077533 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.077582 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.077600 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.077751 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.087883 4921 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.088407 4921 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.088464 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.092412 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.092462 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.092493 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.092531 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.092557 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.114770 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.124728 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.124937 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.125097 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.125360 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.125612 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.142532 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.154670 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.154701 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.154711 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.154726 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.154737 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.168510 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.179953 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.180019 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.180043 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.180070 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.180094 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:10:58Z","lastTransitionTime":"2026-03-18T12:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.195809 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.196070 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.196141 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.297234 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: I0318 12:10:58.350333 4921 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.397398 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.497890 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.598147 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.698330 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.799039 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:58 crc kubenswrapper[4921]: E0318 12:10:58.900209 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.001268 
4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.101345 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.202432 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.302920 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.403078 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.503959 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.604523 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.705325 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.805638 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:59 crc kubenswrapper[4921]: E0318 12:10:59.906905 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.007901 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.108736 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc 
kubenswrapper[4921]: E0318 12:11:00.208912 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.309333 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.409886 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.510970 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.611161 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.711940 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.813039 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:00 crc kubenswrapper[4921]: E0318 12:11:00.914096 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.015307 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.115729 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.215981 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.318712 4921 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.419915 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.520308 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.620959 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.722001 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.822359 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.923543 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:01 crc kubenswrapper[4921]: E0318 12:11:01.949268 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.024299 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.124926 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.225515 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.326535 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.426786 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.527300 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.627641 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.728709 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.829380 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:02 crc kubenswrapper[4921]: E0318 12:11:02.930312 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.031047 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.131205 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.231327 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.332229 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.433370 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.533534 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc 
kubenswrapper[4921]: E0318 12:11:03.634337 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.735297 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.835497 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:03 crc kubenswrapper[4921]: E0318 12:11:03.936274 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.037177 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.137653 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.238664 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.339791 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.440824 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.541935 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.643137 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.744270 4921 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.845390 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:04 crc kubenswrapper[4921]: E0318 12:11:04.945581 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.046428 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.146961 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.247100 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.347370 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.447948 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.549204 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.650213 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.750899 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.851748 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4921]: E0318 12:11:05.952903 4921 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.053800 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.155070 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.257037 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.357199 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.458541 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.558853 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.660008 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.761197 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.862520 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:06 crc kubenswrapper[4921]: E0318 12:11:06.963658 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.064010 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 
12:11:07.164959 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.265913 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.366357 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.466862 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.567268 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.667776 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.767957 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.868372 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:07 crc kubenswrapper[4921]: I0318 12:11:07.931977 4921 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:07 crc kubenswrapper[4921]: E0318 12:11:07.968513 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.068976 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.169598 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.208846 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.210402 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.210438 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.210449 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.260872 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.267243 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.267305 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.267317 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.267340 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.267357 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.280481 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.286669 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.286717 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.286737 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.286775 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.286796 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.305090 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.310559 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.310615 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.310634 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.310664 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.310683 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.324283 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.330281 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.330327 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.330339 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.330354 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:08 crc kubenswrapper[4921]: I0318 12:11:08.330364 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:08Z","lastTransitionTime":"2026-03-18T12:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.346970 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.347100 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.347154 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.448178 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.548663 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.649401 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.749780 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.850169 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:08 crc kubenswrapper[4921]: E0318 12:11:08.950879 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.051969 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.152072 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.252321 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.353293 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.454282 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.555192 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: E0318 12:11:09.656092 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.704762 4921 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.759042 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.759152 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.759172 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.759200 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.759218 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.862645 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.862740 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.862766 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.862808 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.862837 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.965733 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.965773 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.965783 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.965798 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:09 crc kubenswrapper[4921]: I0318 12:11:09.965808 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:09Z","lastTransitionTime":"2026-03-18T12:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.068791 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.068862 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.068871 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.068885 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.068897 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.158792 4921 apiserver.go:52] "Watching apiserver" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.164764 4921 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.165151 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.165791 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.165916 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.166021 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.166072 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.166026 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.166177 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.166040 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.166224 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.170184 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.170404 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.170560 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.173084 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.174003 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.174026 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.173814 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.174013 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.174589 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.174696 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.179841 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.179888 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.179908 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.179940 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.179963 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.200713 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.213419 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.220659 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.220893 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.220979 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.224983 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.233280 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.234866 4921 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.240984 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.251634 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.259712 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.283053 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.283308 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.283390 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.283483 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.283570 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333261 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333310 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333338 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333358 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.333377 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333415 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333476 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333499 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333517 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333537 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333559 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333579 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333600 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333620 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.333643 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333665 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333688 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333712 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333739 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333762 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333809 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333830 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333853 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333872 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.333893 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333915 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333939 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333963 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.333984 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334005 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334027 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334049 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334076 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334095 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334132 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.334153 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334176 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334197 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334218 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334240 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334265 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334314 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334337 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334359 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334385 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.334407 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334431 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334454 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334477 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334500 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334524 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334549 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334574 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334596 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334620 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334643 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 
12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334687 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334709 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334731 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334755 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334776 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334798 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334821 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334844 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334873 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.334920 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334867 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334945 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334968 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334995 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335021 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335052 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335075 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335101 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335139 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335161 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.335184 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335212 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335237 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335262 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335289 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335313 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335334 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335357 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335376 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335398 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335427 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 
12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335457 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335483 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335507 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335534 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335559 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335585 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335610 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335635 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335660 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335683 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335705 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335727 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335752 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335775 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335799 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335825 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335850 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335875 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335901 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335924 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335948 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336003 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.336030 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336054 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336079 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336121 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336175 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336199 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336224 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336248 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336278 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336304 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336329 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336352 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336376 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336400 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336423 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336449 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336472 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336495 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336518 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336540 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336564 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336589 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336613 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336643 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336669 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336693 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336722 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336746 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336770 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336795 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336820 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336846 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 
12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336869 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336922 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336947 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336971 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336999 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337026 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337052 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337105 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337157 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.337183 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337232 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337257 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337307 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337333 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337360 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337387 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337439 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337467 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337493 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337518 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337545 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337572 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337597 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337624 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337649 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337678 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337711 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337738 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337764 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337789 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337814 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337840 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337865 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337891 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337916 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337973 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338064 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338093 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338210 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 
crc kubenswrapper[4921]: I0318 12:11:10.338240 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338265 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338313 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338339 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338386 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338418 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338450 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338475 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338503 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.338531 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338558 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338589 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338615 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338667 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338736 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338761 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338807 4921 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.334911 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335076 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.346184 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335405 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335426 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335482 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335579 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335662 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335863 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335950 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.335278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336102 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.346260 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336341 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336410 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336680 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336799 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336801 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336875 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336947 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336977 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337021 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337075 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337173 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337265 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337192 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337424 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337517 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337446 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337608 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337701 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337755 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.337857 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338318 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.338569 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.339820 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.340218 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.340229 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.340610 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.340633 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.340666 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.340386 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.339756 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.341284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.341577 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.341722 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.341743 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.341786 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.342020 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.342252 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.342357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.342311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.342596 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.342644 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.343589 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.343894 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.343926 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.344353 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.344363 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.344384 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.344840 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.344913 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.345010 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.345297 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.344942 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.345551 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.345560 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.345678 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.345691 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.346851 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.346974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347063 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347469 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347464 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.336375 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347824 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347975 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.347972 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348286 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348289 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348399 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348413 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.348868 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.349350 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.349369 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.349473 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.349706 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.349845 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350005 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350103 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350222 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350363 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350402 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350515 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350544 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350554 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.350624 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351021 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351066 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351292 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351570 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351651 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351551 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351788 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351872 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352054 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352087 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352149 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352338 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352372 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352363 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351358 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352562 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352742 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.352740 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352588 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.352992 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.351368 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.353011 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:10.852900202 +0000 UTC m=+90.402821061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.353011 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.353178 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.353299 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.354389 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.355291 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.353648 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.353685 4921 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.366152 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.366854 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.866823834 +0000 UTC m=+90.416744683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.367940 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.867920343 +0000 UTC m=+90.417841052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.372661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.373013 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.373185 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.382145 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.382183 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.382570 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.382638 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.382831 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.382883 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.382955 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.382966 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.882941204 +0000 UTC m=+90.432861873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.383068 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.383436 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.383498 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.383504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.384872 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.386231 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.386627 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.386664 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.386714 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387087 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387423 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.387520 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.387547 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387712 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387891 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.387983 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.388098 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.388203 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.388197 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.388767 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.389312 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.389517 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:10.889170601 +0000 UTC m=+90.439091300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.391405 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.391624 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.391966 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.391990 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.392002 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.394547 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.396298 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.398071 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.400334 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.400995 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.401258 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.402412 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.402482 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.402691 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.403071 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.403189 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.403849 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.403932 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.404202 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.405610 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.405758 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406212 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406312 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406365 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406380 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406688 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406739 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406856 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406921 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.406924 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.407126 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.407129 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.407605 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.408393 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.408878 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.408959 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.410768 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.410877 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411032 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411321 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411124 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411154 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411190 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411206 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411308 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411406 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.411497 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.416100 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.421324 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440392 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440437 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440529 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440671 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440717 4921 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440746 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440772 4921 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440798 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440822 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440847 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440873 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440897 4921 reconciler_common.go:293] "Volume detached for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440922 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440947 4921 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.440973 4921 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441000 4921 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441024 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441048 4921 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441074 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441103 4921 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441160 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441180 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441197 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441215 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441232 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441249 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441267 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441284 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441300 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441322 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441341 4921 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441357 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441378 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441416 4921 
reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441446 4921 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441472 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441498 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441523 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441553 4921 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441582 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441608 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441628 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441647 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441665 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441681 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441697 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441713 4921 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441729 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath 
\"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441747 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441765 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441781 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441798 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441815 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441832 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441849 4921 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.441865 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441881 4921 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441897 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441914 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441931 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441948 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441967 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.441984 4921 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442001 4921 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442023 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442058 4921 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442095 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442147 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442165 4921 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442182 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442202 4921 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442218 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442235 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442253 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442270 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442289 4921 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442305 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc 
kubenswrapper[4921]: I0318 12:11:10.442322 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442338 4921 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442356 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442373 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442389 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442405 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442422 4921 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442440 4921 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442457 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442477 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442495 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442512 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442531 4921 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442547 4921 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442564 4921 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442581 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442682 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442722 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442748 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442773 4921 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442798 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442822 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442847 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442865 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442885 4921 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442903 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442922 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442942 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442960 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442977 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.442994 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443012 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443044 4921 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443079 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443104 4921 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443166 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443191 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443216 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443241 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443263 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443285 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443310 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443332 4921 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 
12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443355 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443383 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443407 4921 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443431 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443456 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443480 4921 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443503 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443526 4921 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443548 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443565 4921 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443582 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443601 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443617 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443671 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443694 4921 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443711 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443727 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443747 4921 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443770 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443793 4921 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443815 4921 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443837 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on 
node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443862 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443885 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443909 4921 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443934 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443959 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.443986 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444008 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444032 4921 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444055 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444077 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444099 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444154 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444170 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444187 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444203 4921 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444219 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444235 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444252 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444269 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444288 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444305 4921 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444322 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") 
on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444338 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444354 4921 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444372 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444389 4921 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444405 4921 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444422 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444440 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.444456 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444471 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444489 4921 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444507 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444523 4921 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444539 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444556 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444574 4921 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444589 4921 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444606 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444622 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444637 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444653 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444670 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444686 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 
crc kubenswrapper[4921]: I0318 12:11:10.444704 4921 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444724 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444742 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444759 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444778 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444794 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444811 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 
crc kubenswrapper[4921]: I0318 12:11:10.444831 4921 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.444848 4921 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.448537 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.495631 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.495700 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.495724 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.495753 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.495772 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.497015 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.515952 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:10 crc kubenswrapper[4921]: W0318 12:11:10.518210 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ce8d6b041238214b8049948bfbec7b5ef738b2580360d374beb6ad9dae8b5117 WatchSource:0}: Error finding container ce8d6b041238214b8049948bfbec7b5ef738b2580360d374beb6ad9dae8b5117: Status 404 returned error can't find the container with id ce8d6b041238214b8049948bfbec7b5ef738b2580360d374beb6ad9dae8b5117 Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.521075 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:10 crc kubenswrapper[4921]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:10 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:10 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:10 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: fi Mar 18 12:11:10 crc kubenswrapper[4921]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 12:11:10 crc kubenswrapper[4921]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:10 crc kubenswrapper[4921]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:10 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:10 crc kubenswrapper[4921]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:10 crc kubenswrapper[4921]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:10 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:10 crc kubenswrapper[4921]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --webhook-host=127.0.0.1 \ Mar 18 12:11:10 crc kubenswrapper[4921]: --webhook-port=9743 \ Mar 18 12:11:10 crc kubenswrapper[4921]: ${ho_enable} \ Mar 18 12:11:10 crc kubenswrapper[4921]: --enable-interconnect \ Mar 18 12:11:10 crc kubenswrapper[4921]: --disable-approver \ Mar 18 12:11:10 crc kubenswrapper[4921]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:10 crc kubenswrapper[4921]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:10 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:10 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.523705 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:10 crc kubenswrapper[4921]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:10 crc 
kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:10 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:10 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: fi Mar 18 12:11:10 crc kubenswrapper[4921]: Mar 18 12:11:10 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:10 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:10 crc kubenswrapper[4921]: --disable-webhook \ Mar 18 12:11:10 crc kubenswrapper[4921]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:10 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:10 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.524973 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.528280 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:10 crc kubenswrapper[4921]: W0318 12:11:10.531144 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-e7f54493bdb93551a6d76f6efc46df0581ece6d21a0c3cd91cbea340c5b7c89a WatchSource:0}: Error finding container e7f54493bdb93551a6d76f6efc46df0581ece6d21a0c3cd91cbea340c5b7c89a: Status 404 returned error can't find the container with id e7f54493bdb93551a6d76f6efc46df0581ece6d21a0c3cd91cbea340c5b7c89a Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.537493 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.541488 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:10 crc kubenswrapper[4921]: W0318 12:11:10.542946 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-eb938b6f8b6cd2e8383572ebc53ac83ffc2b972730cd246b108acb5bd1beec05 WatchSource:0}: Error finding container eb938b6f8b6cd2e8383572ebc53ac83ffc2b972730cd246b108acb5bd1beec05: Status 404 returned error can't find the container with id eb938b6f8b6cd2e8383572ebc53ac83ffc2b972730cd246b108acb5bd1beec05 Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.545810 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.546437 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:10 crc kubenswrapper[4921]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:10 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:10 crc kubenswrapper[4921]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:10 crc kubenswrapper[4921]: else Mar 18 12:11:10 crc kubenswrapper[4921]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:10 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:10 crc kubenswrapper[4921]: fi Mar 18 12:11:10 crc kubenswrapper[4921]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:10 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:10 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.547951 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.599605 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 
12:11:10.599679 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.599695 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.599717 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.599732 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.609499 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eb938b6f8b6cd2e8383572ebc53ac83ffc2b972730cd246b108acb5bd1beec05"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.610872 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e7f54493bdb93551a6d76f6efc46df0581ece6d21a0c3cd91cbea340c5b7c89a"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.614427 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ce8d6b041238214b8049948bfbec7b5ef738b2580360d374beb6ad9dae8b5117"} Mar 18 12:11:10 crc 
kubenswrapper[4921]: E0318 12:11:10.614674 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.615162 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:10 crc kubenswrapper[4921]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:10 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:10 crc kubenswrapper[4921]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:10 crc kubenswrapper[4921]: else Mar 18 12:11:10 crc kubenswrapper[4921]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:10 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:10 crc kubenswrapper[4921]: fi Mar 18 12:11:10 crc kubenswrapper[4921]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:10 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:10 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.615418 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.615749 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:10 crc kubenswrapper[4921]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:10 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:10 crc 
kubenswrapper[4921]: set -o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:10 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: fi Mar 18 12:11:10 crc kubenswrapper[4921]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 12:11:10 crc kubenswrapper[4921]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:10 crc kubenswrapper[4921]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:10 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:10 crc kubenswrapper[4921]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:10 crc kubenswrapper[4921]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:10 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:10 crc kubenswrapper[4921]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --webhook-host=127.0.0.1 \ Mar 18 12:11:10 crc kubenswrapper[4921]: --webhook-port=9743 \ Mar 18 12:11:10 crc kubenswrapper[4921]: ${ho_enable} \ Mar 18 12:11:10 crc kubenswrapper[4921]: --enable-interconnect \ Mar 18 12:11:10 crc kubenswrapper[4921]: --disable-approver \ Mar 18 12:11:10 crc kubenswrapper[4921]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:10 crc kubenswrapper[4921]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:10 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:10 crc 
kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.616192 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.616598 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.616725 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.617651 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:10 crc kubenswrapper[4921]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:10 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:10 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:10 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:10 crc kubenswrapper[4921]: fi Mar 18 12:11:10 crc kubenswrapper[4921]: Mar 
18 12:11:10 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:10 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:10 crc kubenswrapper[4921]: --disable-webhook \ Mar 18 12:11:10 crc kubenswrapper[4921]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:10 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:10 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:10 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.619697 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.624718 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.638477 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.651999 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.661165 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.675846 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.684798 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.694971 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.702139 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.702182 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.702195 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.702228 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.702239 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.704565 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.714962 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.722576 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.733264 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.745427 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.757010 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.767395 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.804371 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.804417 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.804435 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.804459 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.804476 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.907954 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.908016 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.908027 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.908050 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.908062 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:10Z","lastTransitionTime":"2026-03-18T12:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.948806 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.948890 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.948928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.948969 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:10 crc kubenswrapper[4921]: I0318 12:11:10.949005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949173 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949243 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:11.949221966 +0000 UTC m=+91.499142635 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949411 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949547 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:11.949516304 +0000 UTC m=+91.499436993 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949684 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:11.949671118 +0000 UTC m=+91.499591797 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949706 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949733 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949758 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.949817 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:11.949802001 +0000 UTC m=+91.499722670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.950515 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.950705 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.950848 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:10 crc kubenswrapper[4921]: E0318 12:11:10.951296 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:11.951266341 +0000 UTC m=+91.501187010 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.011761 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.011890 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.011910 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.011938 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.011988 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.114541 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.114574 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.114586 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.114602 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.114612 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.212587 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.213077 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.214327 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.214916 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216012 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216787 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216863 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216886 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216898 4921 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216913 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.216924 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.217564 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.218468 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.219047 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.219924 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.220398 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 12:11:11 crc 
kubenswrapper[4921]: I0318 12:11:11.221422 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.221897 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.222393 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.223257 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.223811 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.224688 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.225042 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.225439 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.225701 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.226609 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.227020 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.227931 4921 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.228386 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.229392 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.229787 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.230362 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.231362 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.231797 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.232672 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.233092 4921 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.234281 4921 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.234411 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.236491 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.237603 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.238145 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.240103 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.240912 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 
12:11:11.241402 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.242027 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.242961 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.244254 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.244836 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.246062 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.247057 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.248289 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.248897 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.250024 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.250811 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.253064 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.254435 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.256452 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.257971 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.258547 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.260657 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.262371 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.263692 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.271157 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.285016 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.303269 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.322460 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.322551 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.322578 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.322611 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.322636 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.323815 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.425576 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.425639 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.425657 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.425684 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.425704 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.529151 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.529266 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.529284 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.529309 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.529328 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.632631 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.632675 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.632686 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.632705 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.632717 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.736008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.736055 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.736067 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.736085 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.736098 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.839848 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.839922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.839940 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.839970 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.839996 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.943600 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.943656 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.943665 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.943680 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.943689 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:11Z","lastTransitionTime":"2026-03-18T12:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.958170 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.958330 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958498 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:13.958436243 +0000 UTC m=+93.508356932 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.958612 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958505 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.958699 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958753 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958771 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:13.958737771 +0000 UTC m=+93.508658440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958812 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: I0318 12:11:11.958817 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958827 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958870 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958874 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958880 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958889 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958830 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:13.958804363 +0000 UTC m=+93.508725042 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958929 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:13.958917976 +0000 UTC m=+93.508838615 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:11 crc kubenswrapper[4921]: E0318 12:11:11.958942 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:13.958937106 +0000 UTC m=+93.508857745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.046431 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.046768 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.046954 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.047101 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.047270 4921 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.150218 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.150261 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.150273 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.150290 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.150302 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.208239 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.208410 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:12 crc kubenswrapper[4921]: E0318 12:11:12.208525 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.208702 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:12 crc kubenswrapper[4921]: E0318 12:11:12.208744 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:12 crc kubenswrapper[4921]: E0318 12:11:12.208981 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.253308 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.253401 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.253448 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.253475 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.253491 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.356481 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.357737 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.357967 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.358238 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.358473 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.461379 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.461430 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.461444 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.461461 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.461472 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.564775 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.564858 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.564880 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.564901 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.564915 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.667723 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.667821 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.667849 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.667887 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.667911 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.770270 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.770318 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.770330 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.770348 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.770360 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.873978 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.874061 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.874078 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.874127 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.874137 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.980672 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.980729 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.980738 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.980752 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:12 crc kubenswrapper[4921]: I0318 12:11:12.980760 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:12Z","lastTransitionTime":"2026-03-18T12:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.084034 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.084085 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.084095 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.084135 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.084148 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.187199 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.187268 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.187285 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.187308 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.187326 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.290849 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.290917 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.290930 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.290955 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.290970 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.394279 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.394355 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.394372 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.394394 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.394411 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.497618 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.497720 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.497742 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.497774 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.497798 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.601228 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.601265 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.601277 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.601295 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.601306 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.704331 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.704405 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.704415 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.704434 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.704446 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.807630 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.808008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.808285 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.808507 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.808743 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.912400 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.912471 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.912496 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.912525 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.912549 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:13Z","lastTransitionTime":"2026-03-18T12:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.977421 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.977531 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.977572 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.977619 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:13 crc kubenswrapper[4921]: I0318 12:11:13.977658 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.977792 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:17.977744229 +0000 UTC m=+97.527664908 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.977800 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.977846 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.977914 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:17.977892623 +0000 UTC m=+97.527813302 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.977939 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:17.977926674 +0000 UTC m=+97.527847353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978028 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978047 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978065 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978088 4921 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978107 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:17.978094719 +0000 UTC m=+97.528015398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978197 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978225 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:13 crc kubenswrapper[4921]: E0318 12:11:13.978294 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:17.978272334 +0000 UTC m=+97.528193053 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.015507 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.015588 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.015607 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.015629 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.015644 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.119513 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.119943 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.120319 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.120795 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.121149 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.208721 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.208749 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.208875 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:14 crc kubenswrapper[4921]: E0318 12:11:14.208966 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:14 crc kubenswrapper[4921]: E0318 12:11:14.209073 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:14 crc kubenswrapper[4921]: E0318 12:11:14.209228 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.225315 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.225410 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.225426 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.225448 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.225463 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.328485 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.328547 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.328561 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.328590 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.328605 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.432057 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.432137 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.432163 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.432185 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.432200 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.535668 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.535720 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.535741 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.535764 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.535782 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.638614 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.638662 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.638673 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.638686 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.638695 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.741414 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.741530 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.741550 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.741579 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.741597 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.844770 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.844884 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.844917 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.844948 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.844977 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.948796 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.948862 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.948871 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.948885 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:14 crc kubenswrapper[4921]: I0318 12:11:14.948894 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:14Z","lastTransitionTime":"2026-03-18T12:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.051738 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.051774 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.051785 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.051798 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.051812 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.154234 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.154296 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.154319 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.154339 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.154352 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.257603 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.257639 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.257646 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.257660 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.257668 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.360948 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.361008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.361025 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.361048 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.361063 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.464276 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.464332 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.464350 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.464375 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.464392 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.567402 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.567441 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.567452 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.567468 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.567479 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.671201 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.671265 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.671283 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.671361 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.671380 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.775281 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.775359 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.775384 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.775414 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.775436 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.877863 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.877918 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.877936 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.877960 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.877976 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.980795 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.980858 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.980873 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.980898 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:15 crc kubenswrapper[4921]: I0318 12:11:15.980914 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:15Z","lastTransitionTime":"2026-03-18T12:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.083832 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.083894 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.083911 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.083936 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.083953 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.191086 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.191194 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.191218 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.191252 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.191284 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.208433 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.208495 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.208434 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:16 crc kubenswrapper[4921]: E0318 12:11:16.208611 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:16 crc kubenswrapper[4921]: E0318 12:11:16.208746 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:16 crc kubenswrapper[4921]: E0318 12:11:16.208881 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.294493 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.294572 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.294590 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.294610 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.294624 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.398063 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.398154 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.398178 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.398203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.398223 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.401657 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2ms8h"] Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.402220 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.404374 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.404770 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.406501 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.414596 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.434820 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.450536 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.462945 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.476390 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.489599 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.501138 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc 
kubenswrapper[4921]: I0318 12:11:16.501188 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.501198 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.501220 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.501233 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.501984 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.502599 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx5k7\" (UniqueName: \"kubernetes.io/projected/576db9cc-6447-4ab5-b6d4-b9b68e48167e-kube-api-access-fx5k7\") pod \"node-resolver-2ms8h\" (UID: \"576db9cc-6447-4ab5-b6d4-b9b68e48167e\") " pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.502653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/576db9cc-6447-4ab5-b6d4-b9b68e48167e-hosts-file\") pod \"node-resolver-2ms8h\" (UID: \"576db9cc-6447-4ab5-b6d4-b9b68e48167e\") " pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.515553 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.603242 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx5k7\" (UniqueName: \"kubernetes.io/projected/576db9cc-6447-4ab5-b6d4-b9b68e48167e-kube-api-access-fx5k7\") pod \"node-resolver-2ms8h\" (UID: \"576db9cc-6447-4ab5-b6d4-b9b68e48167e\") " pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.603313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/576db9cc-6447-4ab5-b6d4-b9b68e48167e-hosts-file\") pod \"node-resolver-2ms8h\" (UID: \"576db9cc-6447-4ab5-b6d4-b9b68e48167e\") " pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.603429 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/576db9cc-6447-4ab5-b6d4-b9b68e48167e-hosts-file\") pod \"node-resolver-2ms8h\" 
(UID: \"576db9cc-6447-4ab5-b6d4-b9b68e48167e\") " pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.604922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.605031 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.605129 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.605233 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.605299 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.634784 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx5k7\" (UniqueName: \"kubernetes.io/projected/576db9cc-6447-4ab5-b6d4-b9b68e48167e-kube-api-access-fx5k7\") pod \"node-resolver-2ms8h\" (UID: \"576db9cc-6447-4ab5-b6d4-b9b68e48167e\") " pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.708563 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.708642 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.708655 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.708673 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.708685 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.723875 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2ms8h" Mar 18 12:11:16 crc kubenswrapper[4921]: W0318 12:11:16.743772 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576db9cc_6447_4ab5_b6d4_b9b68e48167e.slice/crio-a7b90ee66132eded595c8ffc4b6a76298d8d6f798793f118644cf7e25a2373b3 WatchSource:0}: Error finding container a7b90ee66132eded595c8ffc4b6a76298d8d6f798793f118644cf7e25a2373b3: Status 404 returned error can't find the container with id a7b90ee66132eded595c8ffc4b6a76298d8d6f798793f118644cf7e25a2373b3 Mar 18 12:11:16 crc kubenswrapper[4921]: E0318 12:11:16.747624 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:16 crc kubenswrapper[4921]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:16 crc kubenswrapper[4921]: set -uo pipefail Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 12:11:16 crc kubenswrapper[4921]: HOSTS_FILE="/etc/hosts" Mar 18 12:11:16 crc kubenswrapper[4921]: TEMP_FILE="/etc/hosts.tmp" Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: # Make a temporary file with the old hosts file's attributes. Mar 18 12:11:16 crc kubenswrapper[4921]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 12:11:16 crc kubenswrapper[4921]: echo "Failed to preserve hosts file. Exiting." 
Mar 18 12:11:16 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:16 crc kubenswrapper[4921]: fi Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: while true; do Mar 18 12:11:16 crc kubenswrapper[4921]: declare -A svc_ips Mar 18 12:11:16 crc kubenswrapper[4921]: for svc in "${services[@]}"; do Mar 18 12:11:16 crc kubenswrapper[4921]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 12:11:16 crc kubenswrapper[4921]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 12:11:16 crc kubenswrapper[4921]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 12:11:16 crc kubenswrapper[4921]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 12:11:16 crc kubenswrapper[4921]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:16 crc kubenswrapper[4921]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:16 crc kubenswrapper[4921]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:16 crc kubenswrapper[4921]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 12:11:16 crc kubenswrapper[4921]: for i in ${!cmds[*]} Mar 18 12:11:16 crc kubenswrapper[4921]: do Mar 18 12:11:16 crc kubenswrapper[4921]: ips=($(eval "${cmds[i]}")) Mar 18 12:11:16 crc kubenswrapper[4921]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 12:11:16 crc kubenswrapper[4921]: svc_ips["${svc}"]="${ips[@]}" Mar 18 12:11:16 crc kubenswrapper[4921]: break Mar 18 12:11:16 crc kubenswrapper[4921]: fi Mar 18 12:11:16 crc kubenswrapper[4921]: done Mar 18 12:11:16 crc kubenswrapper[4921]: done Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: # Update /etc/hosts only if we get valid service IPs Mar 18 12:11:16 crc kubenswrapper[4921]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 12:11:16 crc kubenswrapper[4921]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 12:11:16 crc kubenswrapper[4921]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 12:11:16 crc kubenswrapper[4921]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 12:11:16 crc kubenswrapper[4921]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 12:11:16 crc kubenswrapper[4921]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 12:11:16 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:16 crc kubenswrapper[4921]: continue Mar 18 12:11:16 crc kubenswrapper[4921]: fi Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: # Append resolver entries for services Mar 18 12:11:16 crc kubenswrapper[4921]: rc=0 Mar 18 12:11:16 crc kubenswrapper[4921]: for svc in "${!svc_ips[@]}"; do Mar 18 12:11:16 crc kubenswrapper[4921]: for ip in ${svc_ips[${svc}]}; do Mar 18 12:11:16 crc kubenswrapper[4921]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 18 12:11:16 crc kubenswrapper[4921]: done Mar 18 12:11:16 crc kubenswrapper[4921]: done Mar 18 12:11:16 crc kubenswrapper[4921]: if [[ $rc -ne 0 ]]; then Mar 18 12:11:16 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:16 crc kubenswrapper[4921]: continue Mar 18 12:11:16 crc kubenswrapper[4921]: fi Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: Mar 18 12:11:16 crc kubenswrapper[4921]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 12:11:16 crc kubenswrapper[4921]: # Replace /etc/hosts with our modified version if needed Mar 18 12:11:16 crc kubenswrapper[4921]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 12:11:16 crc kubenswrapper[4921]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 12:11:16 crc kubenswrapper[4921]: fi Mar 18 12:11:16 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:16 crc kubenswrapper[4921]: unset svc_ips Mar 18 12:11:16 crc kubenswrapper[4921]: done Mar 18 12:11:16 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fx5k7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2ms8h_openshift-dns(576db9cc-6447-4ab5-b6d4-b9b68e48167e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:16 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:16 crc kubenswrapper[4921]: E0318 12:11:16.748880 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2ms8h" podUID="576db9cc-6447-4ab5-b6d4-b9b68e48167e" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.757999 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fsfj7"] Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.758385 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.760982 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.761168 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.761294 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.761289 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gkdzx"] Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.761411 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.761556 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.762825 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-46nj9"] Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.763396 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.764248 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.765336 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.766342 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.766924 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.767181 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.767330 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.767582 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.767746 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.777103 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.794464 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804583 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/509553d8-b894-456c-a45e-665e8497cdbc-mcd-auth-proxy-config\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804636 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-kubelet\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc 
kubenswrapper[4921]: I0318 12:11:16.804658 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-etc-kubernetes\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804710 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-system-cni-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804728 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cnibin\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804752 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2dh7\" (UniqueName: \"kubernetes.io/projected/fba8acfc-17d7-4738-9e7b-58d51c0c8085-kube-api-access-g2dh7\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " 
pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804773 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg55l\" (UniqueName: \"kubernetes.io/projected/509553d8-b894-456c-a45e-665e8497cdbc-kube-api-access-xg55l\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804794 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-cnibin\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804846 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-k8s-cni-cncf-io\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804867 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-hostroot\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804886 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-netns\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " 
pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804905 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5bd\" (UniqueName: \"kubernetes.io/projected/888e124c-ec0f-4c32-bd78-1ff258933bde-kube-api-access-7f5bd\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/509553d8-b894-456c-a45e-665e8497cdbc-rootfs\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804946 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-socket-dir-parent\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.804978 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805005 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-system-cni-dir\") pod 
\"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805059 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-os-release\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805083 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/888e124c-ec0f-4c32-bd78-1ff258933bde-cni-binary-copy\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805125 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-conf-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805088 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805146 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-multus-certs\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805277 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cni-binary-copy\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 
12:11:16.805300 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/509553d8-b894-456c-a45e-665e8497cdbc-proxy-tls\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-os-release\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805346 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-cni-multus\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805365 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-daemon-config\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805395 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-cni-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.805416 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-cni-bin\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.810795 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.810829 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.810841 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.810859 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.810873 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.819584 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.841826 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.857104 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.867061 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.880477 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.897553 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906609 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-cni-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906675 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-cni-bin\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906705 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/509553d8-b894-456c-a45e-665e8497cdbc-mcd-auth-proxy-config\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-kubelet\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " 
pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906766 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-etc-kubernetes\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906804 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-cni-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906810 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906884 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-system-cni-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-cni-bin\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906907 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-kubelet\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-hostroot\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.906909 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-hostroot\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907020 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-system-cni-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907067 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cnibin\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-etc-kubernetes\") pod \"multus-gkdzx\" (UID: 
\"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907133 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cnibin\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2dh7\" (UniqueName: \"kubernetes.io/projected/fba8acfc-17d7-4738-9e7b-58d51c0c8085-kube-api-access-g2dh7\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907248 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg55l\" (UniqueName: \"kubernetes.io/projected/509553d8-b894-456c-a45e-665e8497cdbc-kube-api-access-xg55l\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907271 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-cnibin\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-k8s-cni-cncf-io\") pod \"multus-gkdzx\" (UID: 
\"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-netns\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907321 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5bd\" (UniqueName: \"kubernetes.io/projected/888e124c-ec0f-4c32-bd78-1ff258933bde-kube-api-access-7f5bd\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907342 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/509553d8-b894-456c-a45e-665e8497cdbc-rootfs\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907358 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-socket-dir-parent\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " 
pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907423 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-system-cni-dir\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907440 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-conf-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907461 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-multus-certs\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907496 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-os-release\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907516 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/888e124c-ec0f-4c32-bd78-1ff258933bde-cni-binary-copy\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 
crc kubenswrapper[4921]: I0318 12:11:16.907545 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cni-binary-copy\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907567 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/509553d8-b894-456c-a45e-665e8497cdbc-proxy-tls\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-os-release\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907624 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-cni-multus\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907642 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-daemon-config\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907718 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.907800 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-netns\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908042 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/509553d8-b894-456c-a45e-665e8497cdbc-mcd-auth-proxy-config\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-cnibin\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908330 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-k8s-cni-cncf-io\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908365 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-conf-dir\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908401 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-system-cni-dir\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908411 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-daemon-config\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908476 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-os-release\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908503 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-var-lib-cni-multus\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-host-run-multus-certs\") pod \"multus-gkdzx\" (UID: 
\"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908558 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/509553d8-b894-456c-a45e-665e8497cdbc-rootfs\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.908868 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fba8acfc-17d7-4738-9e7b-58d51c0c8085-os-release\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.909060 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/888e124c-ec0f-4c32-bd78-1ff258933bde-cni-binary-copy\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.909205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/888e124c-ec0f-4c32-bd78-1ff258933bde-multus-socket-dir-parent\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.909351 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cni-binary-copy\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" 
Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.911904 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.915552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fba8acfc-17d7-4738-9e7b-58d51c0c8085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.917174 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.917215 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.917231 4921 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.917253 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.917269 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:16Z","lastTransitionTime":"2026-03-18T12:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.918403 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/509553d8-b894-456c-a45e-665e8497cdbc-proxy-tls\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.925586 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.928171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5bd\" (UniqueName: \"kubernetes.io/projected/888e124c-ec0f-4c32-bd78-1ff258933bde-kube-api-access-7f5bd\") pod \"multus-gkdzx\" (UID: \"888e124c-ec0f-4c32-bd78-1ff258933bde\") " pod="openshift-multus/multus-gkdzx" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.936387 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xg55l\" (UniqueName: \"kubernetes.io/projected/509553d8-b894-456c-a45e-665e8497cdbc-kube-api-access-xg55l\") pod \"machine-config-daemon-fsfj7\" (UID: \"509553d8-b894-456c-a45e-665e8497cdbc\") " pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.938379 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.939108 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2dh7\" (UniqueName: \"kubernetes.io/projected/fba8acfc-17d7-4738-9e7b-58d51c0c8085-kube-api-access-g2dh7\") pod \"multus-additional-cni-plugins-46nj9\" (UID: \"fba8acfc-17d7-4738-9e7b-58d51c0c8085\") " pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.950858 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.960874 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.968060 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.977922 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.986959 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:16 crc kubenswrapper[4921]: I0318 12:11:16.994405 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.010347 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.019938 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.020005 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.020017 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.020036 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.020080 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.026158 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.080290 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.090671 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gkdzx" Mar 18 12:11:17 crc kubenswrapper[4921]: W0318 12:11:17.091915 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509553d8_b894_456c_a45e_665e8497cdbc.slice/crio-952743612ea7802d3a12b1968ae4ea2dfe8616938e2af3c008eb982d70e87d18 WatchSource:0}: Error finding container 952743612ea7802d3a12b1968ae4ea2dfe8616938e2af3c008eb982d70e87d18: Status 404 returned error can't find the container with id 952743612ea7802d3a12b1968ae4ea2dfe8616938e2af3c008eb982d70e87d18 Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.095822 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.100092 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.101299 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:11:17 crc kubenswrapper[4921]: W0318 12:11:17.101661 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888e124c_ec0f_4c32_bd78_1ff258933bde.slice/crio-82c70bb080e6437c076a156fe20d97b4b5c33267c6d341f0991c81f6a078612d WatchSource:0}: Error finding container 82c70bb080e6437c076a156fe20d97b4b5c33267c6d341f0991c81f6a078612d: Status 404 returned error can't find the container with id 82c70bb080e6437c076a156fe20d97b4b5c33267c6d341f0991c81f6a078612d Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.104520 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-46nj9" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.104918 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:17 crc kubenswrapper[4921]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 12:11:17 crc kubenswrapper[4921]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 12:11:17 crc kubenswrapper[4921]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f5bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-gkdzx_openshift-multus(888e124c-ec0f-4c32-bd78-1ff258933bde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:17 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.106164 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-gkdzx" podUID="888e124c-ec0f-4c32-bd78-1ff258933bde" Mar 18 12:11:17 crc kubenswrapper[4921]: W0318 12:11:17.120194 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba8acfc_17d7_4738_9e7b_58d51c0c8085.slice/crio-878e2ce0cb6124533d75372db703fcc1791353f57965348ebcb1ae00d6207698 WatchSource:0}: Error finding container 878e2ce0cb6124533d75372db703fcc1791353f57965348ebcb1ae00d6207698: Status 404 returned error can't find the container with id 878e2ce0cb6124533d75372db703fcc1791353f57965348ebcb1ae00d6207698 Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.122003 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.122033 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.122043 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.122060 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.122072 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.122406 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2dh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMess
agePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-46nj9_openshift-multus(fba8acfc-17d7-4738-9e7b-58d51c0c8085): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.124123 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-46nj9" podUID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.138670 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l6tb7"] Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.139669 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.143627 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.143799 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.143848 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.143641 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.144563 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.144718 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.145018 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.157745 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.172045 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.187521 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.196834 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.208973 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.210593 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-slash\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.210624 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-log-socket\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.210644 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-config\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.210665 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-systemd-units\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211282 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-node-log\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211420 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-netd\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211519 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-systemd\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211597 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-var-lib-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211698 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-ovn\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211743 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovn-node-metrics-cert\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211794 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-kubelet\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211822 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-script-lib\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211843 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-bin\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211914 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.211970 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-env-overrides\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.212026 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnh75\" (UniqueName: \"kubernetes.io/projected/357e939f-66df-4ef0-b64a-a846abdd1ecf-kube-api-access-gnh75\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.212074 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-etc-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.212107 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.212211 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-netns\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.221674 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.224525 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.224562 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.224577 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.224594 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.224606 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.232543 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.242536 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.264027 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-me
trics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"
etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":
\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.296357 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313240 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-netns\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-slash\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 
12:11:17.313320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-log-socket\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-config\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313363 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313383 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-node-log\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313404 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-systemd-units\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313425 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-netd\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313455 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-systemd\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313477 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-var-lib-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovn-node-metrics-cert\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313522 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-ovn\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313541 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-kubelet\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313573 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-script-lib\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313594 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-bin\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313647 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-env-overrides\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313670 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnh75\" (UniqueName: 
\"kubernetes.io/projected/357e939f-66df-4ef0-b64a-a846abdd1ecf-kube-api-access-gnh75\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313691 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-etc-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313715 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313829 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-netns\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313856 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-slash\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.313882 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-log-socket\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-netd\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314615 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314559 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-var-lib-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314649 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-config\") pod 
\"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-ovn\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314620 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314690 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-kubelet\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314588 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-systemd\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314674 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-bin\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc 
kubenswrapper[4921]: I0318 12:11:17.314669 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-systemd-units\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314667 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-node-log\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.314645 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-etc-openvswitch\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.315103 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-env-overrides\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.315305 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-script-lib\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.319456 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovn-node-metrics-cert\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.319542 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.327256 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.327441 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.327524 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.327608 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.327688 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.331673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnh75\" (UniqueName: \"kubernetes.io/projected/357e939f-66df-4ef0-b64a-a846abdd1ecf-kube-api-access-gnh75\") pod \"ovnkube-node-l6tb7\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.332067 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.429741 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.429781 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.429791 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.429807 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.429816 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.462021 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.477342 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:17 crc kubenswrapper[4921]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 12:11:17 crc kubenswrapper[4921]: apiVersion: v1 Mar 18 12:11:17 crc kubenswrapper[4921]: clusters: Mar 18 12:11:17 crc kubenswrapper[4921]: - cluster: Mar 18 12:11:17 crc kubenswrapper[4921]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 12:11:17 crc kubenswrapper[4921]: server: https://api-int.crc.testing:6443 Mar 18 12:11:17 crc kubenswrapper[4921]: name: default-cluster Mar 18 12:11:17 crc kubenswrapper[4921]: contexts: Mar 18 12:11:17 crc kubenswrapper[4921]: - context: Mar 18 12:11:17 crc kubenswrapper[4921]: cluster: default-cluster Mar 18 12:11:17 crc kubenswrapper[4921]: namespace: default Mar 18 12:11:17 crc kubenswrapper[4921]: user: default-auth Mar 18 12:11:17 crc kubenswrapper[4921]: name: default-context Mar 18 12:11:17 crc kubenswrapper[4921]: current-context: default-context Mar 18 12:11:17 crc kubenswrapper[4921]: kind: Config Mar 18 12:11:17 crc kubenswrapper[4921]: preferences: {} Mar 18 12:11:17 crc kubenswrapper[4921]: users: Mar 18 12:11:17 crc kubenswrapper[4921]: - name: default-auth Mar 18 12:11:17 crc kubenswrapper[4921]: user: Mar 18 12:11:17 crc kubenswrapper[4921]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:17 crc kubenswrapper[4921]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:17 crc kubenswrapper[4921]: EOF Mar 18 12:11:17 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnh75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-l6tb7_openshift-ovn-kubernetes(357e939f-66df-4ef0-b64a-a846abdd1ecf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:17 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.478510 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.533149 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.533197 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.533214 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.533236 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.533251 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.640426 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.640473 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.640489 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.640512 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.640530 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.642030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"1302961abac4154ed57ee62ff75a67eaca79098b936d587d758e976384f2c48e"} Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.643936 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:17 crc kubenswrapper[4921]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 12:11:17 crc kubenswrapper[4921]: apiVersion: v1 Mar 18 12:11:17 crc kubenswrapper[4921]: clusters: Mar 18 12:11:17 crc kubenswrapper[4921]: - cluster: Mar 18 12:11:17 crc kubenswrapper[4921]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 12:11:17 crc kubenswrapper[4921]: server: https://api-int.crc.testing:6443 Mar 18 12:11:17 crc kubenswrapper[4921]: name: default-cluster Mar 18 12:11:17 crc kubenswrapper[4921]: contexts: Mar 18 12:11:17 crc kubenswrapper[4921]: - context: Mar 18 12:11:17 crc kubenswrapper[4921]: cluster: default-cluster Mar 18 12:11:17 crc kubenswrapper[4921]: namespace: default Mar 18 12:11:17 crc kubenswrapper[4921]: user: default-auth Mar 18 12:11:17 crc kubenswrapper[4921]: name: default-context Mar 18 12:11:17 crc kubenswrapper[4921]: current-context: default-context Mar 18 12:11:17 crc kubenswrapper[4921]: kind: Config Mar 18 12:11:17 crc kubenswrapper[4921]: preferences: {} Mar 18 12:11:17 crc kubenswrapper[4921]: users: Mar 18 12:11:17 crc kubenswrapper[4921]: - name: default-auth Mar 18 12:11:17 crc kubenswrapper[4921]: user: Mar 18 12:11:17 crc kubenswrapper[4921]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:17 crc 
kubenswrapper[4921]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:17 crc kubenswrapper[4921]: EOF Mar 18 12:11:17 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnh75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-l6tb7_openshift-ovn-kubernetes(357e939f-66df-4ef0-b64a-a846abdd1ecf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:17 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.645139 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.645812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gkdzx" event={"ID":"888e124c-ec0f-4c32-bd78-1ff258933bde","Type":"ContainerStarted","Data":"82c70bb080e6437c076a156fe20d97b4b5c33267c6d341f0991c81f6a078612d"} 
Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.650368 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"952743612ea7802d3a12b1968ae4ea2dfe8616938e2af3c008eb982d70e87d18"} Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.650597 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:17 crc kubenswrapper[4921]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 12:11:17 crc kubenswrapper[4921]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 12:11:17 crc kubenswrapper[4921]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f5bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-gkdzx_openshift-multus(888e124c-ec0f-4c32-bd78-1ff258933bde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:17 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.651717 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ms8h" event={"ID":"576db9cc-6447-4ab5-b6d4-b9b68e48167e","Type":"ContainerStarted","Data":"a7b90ee66132eded595c8ffc4b6a76298d8d6f798793f118644cf7e25a2373b3"} Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.651738 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services 
have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-gkdzx" podUID="888e124c-ec0f-4c32-bd78-1ff258933bde" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.651831 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.652735 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerStarted","Data":"878e2ce0cb6124533d75372db703fcc1791353f57965348ebcb1ae00d6207698"} Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.653854 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:17 crc kubenswrapper[4921]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:17 crc kubenswrapper[4921]: set -uo pipefail Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc 
kubenswrapper[4921]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 12:11:17 crc kubenswrapper[4921]: HOSTS_FILE="/etc/hosts" Mar 18 12:11:17 crc kubenswrapper[4921]: TEMP_FILE="/etc/hosts.tmp" Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: # Make a temporary file with the old hosts file's attributes. Mar 18 12:11:17 crc kubenswrapper[4921]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 12:11:17 crc kubenswrapper[4921]: echo "Failed to preserve hosts file. Exiting." Mar 18 12:11:17 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:17 crc kubenswrapper[4921]: fi Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: while true; do Mar 18 12:11:17 crc kubenswrapper[4921]: declare -A svc_ips Mar 18 12:11:17 crc kubenswrapper[4921]: for svc in "${services[@]}"; do Mar 18 12:11:17 crc kubenswrapper[4921]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 12:11:17 crc kubenswrapper[4921]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 18 12:11:17 crc kubenswrapper[4921]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 12:11:17 crc kubenswrapper[4921]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 18 12:11:17 crc kubenswrapper[4921]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:17 crc kubenswrapper[4921]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:17 crc kubenswrapper[4921]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:17 crc kubenswrapper[4921]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 12:11:17 crc kubenswrapper[4921]: for i in ${!cmds[*]} Mar 18 12:11:17 crc kubenswrapper[4921]: do Mar 18 12:11:17 crc kubenswrapper[4921]: ips=($(eval "${cmds[i]}")) Mar 18 12:11:17 crc kubenswrapper[4921]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 12:11:17 crc kubenswrapper[4921]: svc_ips["${svc}"]="${ips[@]}" Mar 18 12:11:17 crc kubenswrapper[4921]: break Mar 18 12:11:17 crc kubenswrapper[4921]: fi Mar 18 12:11:17 crc kubenswrapper[4921]: done Mar 18 12:11:17 crc kubenswrapper[4921]: done Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: # Update /etc/hosts only if we get valid service IPs Mar 18 12:11:17 crc kubenswrapper[4921]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 12:11:17 crc kubenswrapper[4921]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 12:11:17 crc kubenswrapper[4921]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 12:11:17 crc kubenswrapper[4921]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 12:11:17 crc kubenswrapper[4921]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 12:11:17 crc kubenswrapper[4921]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 12:11:17 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:17 crc kubenswrapper[4921]: continue Mar 18 12:11:17 crc kubenswrapper[4921]: fi Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: # Append resolver entries for services Mar 18 12:11:17 crc kubenswrapper[4921]: rc=0 Mar 18 12:11:17 crc kubenswrapper[4921]: for svc in "${!svc_ips[@]}"; do Mar 18 12:11:17 crc kubenswrapper[4921]: for ip in ${svc_ips[${svc}]}; do Mar 18 12:11:17 crc kubenswrapper[4921]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 18 12:11:17 crc kubenswrapper[4921]: done Mar 18 12:11:17 crc kubenswrapper[4921]: done Mar 18 12:11:17 crc kubenswrapper[4921]: if [[ $rc -ne 0 ]]; then Mar 18 12:11:17 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:17 crc kubenswrapper[4921]: continue Mar 18 12:11:17 crc kubenswrapper[4921]: fi Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: Mar 18 12:11:17 crc kubenswrapper[4921]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 12:11:17 crc kubenswrapper[4921]: # Replace /etc/hosts with our modified version if needed Mar 18 12:11:17 crc kubenswrapper[4921]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 12:11:17 crc kubenswrapper[4921]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 12:11:17 crc kubenswrapper[4921]: fi Mar 18 12:11:17 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:17 crc kubenswrapper[4921]: unset svc_ips Mar 18 12:11:17 crc kubenswrapper[4921]: done Mar 18 12:11:17 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fx5k7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2ms8h_openshift-dns(576db9cc-6447-4ab5-b6d4-b9b68e48167e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:17 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.654505 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 
--config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:17 crc 
kubenswrapper[4921]: E0318 12:11:17.655990 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.656086 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2ms8h" podUID="576db9cc-6447-4ab5-b6d4-b9b68e48167e" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.656026 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.657985 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2dh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPrese
nt,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-46nj9_openshift-multus(fba8acfc-17d7-4738-9e7b-58d51c0c8085): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:17 crc kubenswrapper[4921]: E0318 12:11:17.659644 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-46nj9" podUID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.664449 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.680384 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.690312 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.706931 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.721695 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.734842 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.743565 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.743602 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.743612 4921 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.743626 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.743635 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.747749 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.762658 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.776461 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.791428 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.806410 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.819783 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.840239 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.846878 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.846943 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.846959 4921 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.846977 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.846991 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.853513 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.864495 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.873897 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.883009 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.895070 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.910789 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.925188 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.935982 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.947917 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.949845 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.949893 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.949904 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.949922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.949936 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:17Z","lastTransitionTime":"2026-03-18T12:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:17 crc kubenswrapper[4921]: I0318 12:11:17.964372 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.021820 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.022046 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022150 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.022067856 +0000 UTC m=+105.571988525 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022204 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.022227 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022279 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.022258041 +0000 UTC m=+105.572178690 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.022345 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.022429 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022454 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022507 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022531 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022586 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022599 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.022577299 +0000 UTC m=+105.572498148 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022641 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.022626721 +0000 UTC m=+105.572547400 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022678 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022692 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022701 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.022737 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:26.022730533 +0000 UTC m=+105.572651172 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.052693 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.052813 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.052840 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.052879 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.052902 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.156668 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.156742 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.156753 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.156776 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.156796 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.208827 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.208927 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.209004 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.209025 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.209243 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.209339 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.259980 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.260040 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.260055 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.260075 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.260086 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.362797 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.362840 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.362852 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.362868 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.362881 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.469935 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.470032 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.470063 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.470103 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.470178 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.574289 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.574363 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.574386 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.574417 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.574440 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.677440 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.677500 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.677517 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.677540 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.677559 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.729155 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.729225 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.729250 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.729279 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.729301 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.745266 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.750405 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.750466 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.750491 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.750523 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.750542 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.788154 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.793721 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.793786 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.793803 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.793866 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.793883 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.809483 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.814578 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.814727 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.814749 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.814774 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.814795 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.829861 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:18 crc kubenswrapper[4921]: E0318 12:11:18.830134 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.832456 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.832545 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.832566 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.832590 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.832606 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.936209 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.936286 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.936321 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.936350 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:18 crc kubenswrapper[4921]: I0318 12:11:18.936371 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:18Z","lastTransitionTime":"2026-03-18T12:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.042332 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.042381 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.042398 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.042421 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.042438 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.145145 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.145190 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.145205 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.145227 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.145242 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.248030 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.248082 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.248098 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.248153 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.248179 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.351414 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.351483 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.351500 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.351523 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.351543 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.454629 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.454711 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.454734 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.454763 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.454785 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.558418 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.558500 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.558526 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.558553 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.558573 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.661149 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.661215 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.661231 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.661252 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.661276 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.764378 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.764454 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.764472 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.764498 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.764522 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.868026 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.868083 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.868100 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.868160 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.868177 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.970548 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.970620 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.970638 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.970664 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:19 crc kubenswrapper[4921]: I0318 12:11:19.970681 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:19Z","lastTransitionTime":"2026-03-18T12:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.074034 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.074094 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.074151 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.074175 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.074193 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.177859 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.177913 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.177925 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.177943 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.177954 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.208604 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.208713 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.208630 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:20 crc kubenswrapper[4921]: E0318 12:11:20.208826 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:20 crc kubenswrapper[4921]: E0318 12:11:20.208989 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:20 crc kubenswrapper[4921]: E0318 12:11:20.209100 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.281097 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.281188 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.281199 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.281214 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.281248 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.384159 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.384233 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.384245 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.384263 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.384274 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.486873 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.487028 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.487039 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.487055 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.487069 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.590427 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.590488 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.590505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.590531 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.590548 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.693520 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.693762 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.693796 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.693824 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.693880 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.798850 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.798925 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.798936 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.798952 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.799169 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.902316 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.902380 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.902399 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.902422 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:20 crc kubenswrapper[4921]: I0318 12:11:20.902440 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:20Z","lastTransitionTime":"2026-03-18T12:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.004641 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.004679 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.004688 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.004702 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.004713 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.107893 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.107977 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.108001 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.108031 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.108053 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: E0318 12:11:21.211799 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:21 crc kubenswrapper[4921]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:21 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:21 crc kubenswrapper[4921]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:21 crc kubenswrapper[4921]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:21 crc kubenswrapper[4921]: else Mar 18 12:11:21 crc kubenswrapper[4921]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:21 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:21 crc kubenswrapper[4921]: fi Mar 18 12:11:21 crc kubenswrapper[4921]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:21 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:21 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:21 crc kubenswrapper[4921]: E0318 12:11:21.212955 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.215092 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 
12:11:21.215186 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.215204 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.215229 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.215246 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.227975 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.239720 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.256447 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.279522 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.296765 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.317191 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.317750 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.317806 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.317830 4921 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.317859 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.317879 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.335160 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.355901 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.371958 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.384751 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.396284 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.409451 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.420852 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.420909 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.420929 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.420956 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.420974 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.524472 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.524539 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.524556 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.524582 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.524598 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.627286 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.627339 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.627362 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.627388 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.627406 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.730986 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.731035 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.731052 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.731075 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.731096 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.835721 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.837233 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.837320 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.837351 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.837373 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.940052 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.940140 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.940160 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.940185 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:21 crc kubenswrapper[4921]: I0318 12:11:21.940202 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:21Z","lastTransitionTime":"2026-03-18T12:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.042758 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.042821 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.042833 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.042851 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.042866 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.146289 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.146355 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.146377 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.146406 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.146428 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.208460 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.208522 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:22 crc kubenswrapper[4921]: E0318 12:11:22.208602 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:22 crc kubenswrapper[4921]: E0318 12:11:22.208690 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.208752 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:22 crc kubenswrapper[4921]: E0318 12:11:22.208828 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.249137 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.249188 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.249201 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.249220 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.249232 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.353079 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.353148 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.353158 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.353172 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.353182 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.455851 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.455911 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.455928 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.455951 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.455967 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.558663 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.558734 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.558753 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.558779 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.558796 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.661807 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.661861 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.661876 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.661897 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.661912 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.764836 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.764915 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.764937 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.764967 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.764986 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.867103 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.867178 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.867194 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.867216 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.867233 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.948495 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8s5jb"] Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.949199 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.951340 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.952397 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.952450 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.953417 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.962500 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.975591 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.975622 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.975651 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.975664 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.975675 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:22Z","lastTransitionTime":"2026-03-18T12:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.978406 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:22 crc kubenswrapper[4921]: I0318 12:11:22.991921 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.003738 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.010899 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.024373 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.037039 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.052201 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.060631 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.070039 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.076577 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43869354-ed6a-464a-8c55-8b05e3c9dc82-serviceca\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.076681 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43869354-ed6a-464a-8c55-8b05e3c9dc82-host\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.076726 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwvjb\" (UniqueName: \"kubernetes.io/projected/43869354-ed6a-464a-8c55-8b05e3c9dc82-kube-api-access-kwvjb\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.077688 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.077722 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.077731 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.077746 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.077756 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.078869 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.085550 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.093775 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.178039 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43869354-ed6a-464a-8c55-8b05e3c9dc82-host\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc 
kubenswrapper[4921]: I0318 12:11:23.178177 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwvjb\" (UniqueName: \"kubernetes.io/projected/43869354-ed6a-464a-8c55-8b05e3c9dc82-kube-api-access-kwvjb\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.178257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43869354-ed6a-464a-8c55-8b05e3c9dc82-serviceca\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.178298 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43869354-ed6a-464a-8c55-8b05e3c9dc82-host\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.180240 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43869354-ed6a-464a-8c55-8b05e3c9dc82-serviceca\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.181411 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.181463 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.181546 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 
12:11:23.181650 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.181677 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.203363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwvjb\" (UniqueName: \"kubernetes.io/projected/43869354-ed6a-464a-8c55-8b05e3c9dc82-kube-api-access-kwvjb\") pod \"node-ca-8s5jb\" (UID: \"43869354-ed6a-464a-8c55-8b05e3c9dc82\") " pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.266780 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8s5jb" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.284931 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.284984 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.284997 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.285015 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.285027 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: W0318 12:11:23.287058 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43869354_ed6a_464a_8c55_8b05e3c9dc82.slice/crio-ee1a230a6f9e1cef4d066068e4f8a51e5546aa862b8c89e56f321ec7682ae8d3 WatchSource:0}: Error finding container ee1a230a6f9e1cef4d066068e4f8a51e5546aa862b8c89e56f321ec7682ae8d3: Status 404 returned error can't find the container with id ee1a230a6f9e1cef4d066068e4f8a51e5546aa862b8c89e56f321ec7682ae8d3 Mar 18 12:11:23 crc kubenswrapper[4921]: E0318 12:11:23.291220 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:23 crc kubenswrapper[4921]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 12:11:23 crc kubenswrapper[4921]: while [ true ]; Mar 18 12:11:23 crc kubenswrapper[4921]: do Mar 18 12:11:23 crc kubenswrapper[4921]: for f in $(ls /tmp/serviceca); do Mar 18 12:11:23 crc kubenswrapper[4921]: echo $f Mar 18 12:11:23 crc kubenswrapper[4921]: ca_file_path="/tmp/serviceca/${f}" Mar 18 12:11:23 crc kubenswrapper[4921]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 12:11:23 crc kubenswrapper[4921]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 12:11:23 crc kubenswrapper[4921]: if [ -e "${reg_dir_path}" ]; then Mar 18 12:11:23 crc kubenswrapper[4921]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 12:11:23 crc kubenswrapper[4921]: else Mar 18 12:11:23 crc kubenswrapper[4921]: mkdir $reg_dir_path Mar 18 12:11:23 crc kubenswrapper[4921]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 12:11:23 crc kubenswrapper[4921]: fi Mar 18 12:11:23 crc kubenswrapper[4921]: done Mar 18 12:11:23 crc kubenswrapper[4921]: for d in $(ls /etc/docker/certs.d); do Mar 18 12:11:23 crc 
kubenswrapper[4921]: echo $d Mar 18 12:11:23 crc kubenswrapper[4921]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 12:11:23 crc kubenswrapper[4921]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 12:11:23 crc kubenswrapper[4921]: if [ ! -e "${reg_conf_path}" ]; then Mar 18 12:11:23 crc kubenswrapper[4921]: rm -rf /etc/docker/certs.d/$d Mar 18 12:11:23 crc kubenswrapper[4921]: fi Mar 18 12:11:23 crc kubenswrapper[4921]: done Mar 18 12:11:23 crc kubenswrapper[4921]: sleep 60 & wait ${!} Mar 18 12:11:23 crc kubenswrapper[4921]: done Mar 18 12:11:23 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwvjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-8s5jb_openshift-image-registry(43869354-ed6a-464a-8c55-8b05e3c9dc82): 
CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:23 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:23 crc kubenswrapper[4921]: E0318 12:11:23.292412 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-8s5jb" podUID="43869354-ed6a-464a-8c55-8b05e3c9dc82" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.388329 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.388391 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.388404 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.388428 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.388483 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.491585 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.491629 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.491638 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.491654 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.491667 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.594600 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.594649 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.594660 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.594701 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.594714 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.672990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8s5jb" event={"ID":"43869354-ed6a-464a-8c55-8b05e3c9dc82","Type":"ContainerStarted","Data":"ee1a230a6f9e1cef4d066068e4f8a51e5546aa862b8c89e56f321ec7682ae8d3"} Mar 18 12:11:23 crc kubenswrapper[4921]: E0318 12:11:23.674651 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:23 crc kubenswrapper[4921]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 12:11:23 crc kubenswrapper[4921]: while [ true ]; Mar 18 12:11:23 crc kubenswrapper[4921]: do Mar 18 12:11:23 crc kubenswrapper[4921]: for f in $(ls /tmp/serviceca); do Mar 18 12:11:23 crc kubenswrapper[4921]: echo $f Mar 18 12:11:23 crc kubenswrapper[4921]: ca_file_path="/tmp/serviceca/${f}" Mar 18 12:11:23 crc kubenswrapper[4921]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 12:11:23 crc kubenswrapper[4921]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 12:11:23 crc kubenswrapper[4921]: if [ -e "${reg_dir_path}" ]; then Mar 18 12:11:23 crc kubenswrapper[4921]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 12:11:23 crc kubenswrapper[4921]: else Mar 18 12:11:23 crc kubenswrapper[4921]: mkdir $reg_dir_path Mar 18 12:11:23 crc kubenswrapper[4921]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 12:11:23 crc kubenswrapper[4921]: fi Mar 18 12:11:23 crc kubenswrapper[4921]: done Mar 18 12:11:23 crc kubenswrapper[4921]: for d in $(ls /etc/docker/certs.d); do Mar 18 12:11:23 crc kubenswrapper[4921]: echo $d Mar 18 12:11:23 crc kubenswrapper[4921]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 12:11:23 crc kubenswrapper[4921]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 12:11:23 crc kubenswrapper[4921]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 12:11:23 crc kubenswrapper[4921]: rm -rf /etc/docker/certs.d/$d Mar 18 12:11:23 crc kubenswrapper[4921]: fi Mar 18 12:11:23 crc kubenswrapper[4921]: done Mar 18 12:11:23 crc kubenswrapper[4921]: sleep 60 & wait ${!} Mar 18 12:11:23 crc kubenswrapper[4921]: done Mar 18 12:11:23 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwvjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-8s5jb_openshift-image-registry(43869354-ed6a-464a-8c55-8b05e3c9dc82): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:23 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:23 crc kubenswrapper[4921]: E0318 12:11:23.675931 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-8s5jb" podUID="43869354-ed6a-464a-8c55-8b05e3c9dc82" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.690909 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.698765 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.698835 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.698852 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.698913 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.698932 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.705268 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.719493 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.739884 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.750270 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.763248 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.775788 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.802904 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.802960 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.802975 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.802808 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.802996 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.803253 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.817247 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.835312 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.855045 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.867749 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.885079 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.910154 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.910196 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.910208 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.910225 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:23 crc kubenswrapper[4921]: I0318 12:11:23.910237 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:23Z","lastTransitionTime":"2026-03-18T12:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.012820 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.012865 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.012882 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.012905 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.012920 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.116377 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.116445 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.116470 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.116499 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.116516 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.208619 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.208834 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:24 crc kubenswrapper[4921]: E0318 12:11:24.208940 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.208834 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:24 crc kubenswrapper[4921]: E0318 12:11:24.209013 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:24 crc kubenswrapper[4921]: E0318 12:11:24.209192 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:24 crc kubenswrapper[4921]: E0318 12:11:24.211605 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]V
olumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:24 crc kubenswrapper[4921]: E0318 12:11:24.212979 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.219517 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.219551 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.219568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.219588 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.219602 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.322365 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.322447 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.322472 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.322505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.322531 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.427160 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.427241 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.427266 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.427295 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.427317 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.530179 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.530260 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.530280 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.530311 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.530330 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.634146 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.634211 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.634228 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.634257 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.634275 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.737568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.737637 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.737658 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.737686 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.737703 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.841743 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.841814 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.841835 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.841869 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.841887 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.945265 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.945337 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.945355 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.945382 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4921]: I0318 12:11:24.945399 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.048515 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.048592 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.048609 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.048626 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.048635 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.151845 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.151935 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.151960 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.151994 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.152019 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.209664 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:11:25 crc kubenswrapper[4921]: E0318 12:11:25.211483 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:25 crc kubenswrapper[4921]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:25 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:25 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:25 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:25 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:25 crc kubenswrapper[4921]: fi Mar 18 12:11:25 crc kubenswrapper[4921]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 12:11:25 crc kubenswrapper[4921]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:25 crc kubenswrapper[4921]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:25 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:25 crc kubenswrapper[4921]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:25 crc kubenswrapper[4921]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:25 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:25 crc kubenswrapper[4921]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:25 crc kubenswrapper[4921]: --webhook-host=127.0.0.1 \ Mar 18 12:11:25 crc kubenswrapper[4921]: --webhook-port=9743 \ Mar 18 12:11:25 crc kubenswrapper[4921]: ${ho_enable} \ Mar 18 12:11:25 crc kubenswrapper[4921]: --enable-interconnect \ Mar 18 12:11:25 crc kubenswrapper[4921]: --disable-approver \ Mar 18 12:11:25 crc kubenswrapper[4921]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:25 crc kubenswrapper[4921]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:25 crc kubenswrapper[4921]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:25 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:25 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:25 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:25 crc kubenswrapper[4921]: E0318 12:11:25.214001 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:25 crc kubenswrapper[4921]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:25 crc 
kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:25 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:25 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:25 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:25 crc kubenswrapper[4921]: fi Mar 18 12:11:25 crc kubenswrapper[4921]: Mar 18 12:11:25 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:25 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:25 crc kubenswrapper[4921]: --disable-webhook \ Mar 18 12:11:25 crc kubenswrapper[4921]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:25 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:25 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:25 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:25 crc kubenswrapper[4921]: E0318 12:11:25.215405 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.255260 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.255348 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.255376 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.255409 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.255432 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.358529 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.358578 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.358591 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.358613 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.358625 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.461965 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.462051 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.462096 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.462172 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.462194 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.567825 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.567880 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.567898 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.567921 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.567939 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.675056 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.675164 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.675186 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.675218 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.675248 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.684447 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.686775 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.687333 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.706193 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.716532 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.731907 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.747816 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.762502 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.776312 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.778394 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.778468 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.778488 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.778513 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.778531 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.796646 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.808571 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.820435 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.829280 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.838951 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.848745 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.858245 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.881922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.881986 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.882009 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.882040 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.882068 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.986441 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.986478 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.986490 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.986504 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:25 crc kubenswrapper[4921]: I0318 12:11:25.986515 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:25Z","lastTransitionTime":"2026-03-18T12:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.089949 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.089980 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.089991 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.090004 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.090014 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.111841 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.111908 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.111937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.111959 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.111979 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112078 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112152 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112165 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:42.112095037 +0000 UTC m=+121.662015686 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112215 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:42.11220192 +0000 UTC m=+121.662122569 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112235 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112237 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:42.112225781 +0000 UTC m=+121.662146500 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112249 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112262 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112299 4921 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:42.112285583 +0000 UTC m=+121.662206222 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112344 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112354 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112362 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.112379 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:42.112373875 +0000 UTC m=+121.662294504 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.194056 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.194173 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.194199 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.194226 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.194245 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.208587 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.208663 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.208713 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.208584 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.208836 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:26 crc kubenswrapper[4921]: E0318 12:11:26.208982 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.297795 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.297836 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.297847 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.297864 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.297875 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.400638 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.400687 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.400702 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.400718 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.400729 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.504538 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.504575 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.504584 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.504599 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.504608 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.608095 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.608182 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.608203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.608234 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.608254 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.710306 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.710354 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.710369 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.710388 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.710403 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.813549 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.813638 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.813659 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.813687 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.813707 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.917060 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.917095 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.917102 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.917127 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:26 crc kubenswrapper[4921]: I0318 12:11:26.917135 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:26Z","lastTransitionTime":"2026-03-18T12:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.019620 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.019708 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.019734 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.019766 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.019788 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.122795 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.122844 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.122862 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.122885 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.122903 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.225968 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.226005 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.226018 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.226036 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.226049 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.332561 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.332618 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.332630 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.332649 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.332665 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.436346 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.436412 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.436428 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.436453 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.436470 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.539531 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.539606 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.539628 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.539657 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.539678 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.642563 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.642630 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.642651 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.642678 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.642696 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.746145 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.746202 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.746218 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.746240 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.746255 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.850209 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.850279 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.850312 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.850343 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.850363 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.954052 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.954143 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.954172 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.954204 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:27 crc kubenswrapper[4921]: I0318 12:11:27.954227 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:27Z","lastTransitionTime":"2026-03-18T12:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.057853 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.057924 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.057948 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.057976 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.057999 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.160977 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.161035 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.161047 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.161065 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.161076 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.208466 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.208614 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.208628 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:28 crc kubenswrapper[4921]: E0318 12:11:28.208943 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:28 crc kubenswrapper[4921]: E0318 12:11:28.209055 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:28 crc kubenswrapper[4921]: E0318 12:11:28.209248 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:28 crc kubenswrapper[4921]: E0318 12:11:28.211549 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:28 crc kubenswrapper[4921]: E0318 12:11:28.213599 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:28 crc kubenswrapper[4921]: E0318 12:11:28.214738 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.264083 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.264192 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.264280 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.264315 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.264337 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.366951 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.366995 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.367005 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.367023 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.367034 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.470648 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.470725 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.470747 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.470774 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.470800 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.573583 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.573622 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.573631 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.573648 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.573657 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.676887 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.677047 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.677069 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.677093 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.677137 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.779854 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.779896 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.779917 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.779939 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.779955 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.788300 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg"] Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.788898 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.792051 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.792217 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.808801 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.822654 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.834012 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.846551 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2119089d-3cd5-4294-8e34-f5dcb27e0e34-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.846612 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2119089d-3cd5-4294-8e34-f5dcb27e0e34-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.846713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/2119089d-3cd5-4294-8e34-f5dcb27e0e34-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.846767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqwnz\" (UniqueName: \"kubernetes.io/projected/2119089d-3cd5-4294-8e34-f5dcb27e0e34-kube-api-access-nqwnz\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.853913 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.870166 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.882551 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.882595 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.882610 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.882630 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.882643 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.885395 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.900853 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.915526 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.941526 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.947773 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqwnz\" (UniqueName: \"kubernetes.io/projected/2119089d-3cd5-4294-8e34-f5dcb27e0e34-kube-api-access-nqwnz\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.947883 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2119089d-3cd5-4294-8e34-f5dcb27e0e34-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.947936 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2119089d-3cd5-4294-8e34-f5dcb27e0e34-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.948175 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2119089d-3cd5-4294-8e34-f5dcb27e0e34-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.949317 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2119089d-3cd5-4294-8e34-f5dcb27e0e34-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.949772 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2119089d-3cd5-4294-8e34-f5dcb27e0e34-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.957930 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.961939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/2119089d-3cd5-4294-8e34-f5dcb27e0e34-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.972086 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.980987 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqwnz\" (UniqueName: \"kubernetes.io/projected/2119089d-3cd5-4294-8e34-f5dcb27e0e34-kube-api-access-nqwnz\") pod \"ovnkube-control-plane-749d76644c-6t6wg\" (UID: \"2119089d-3cd5-4294-8e34-f5dcb27e0e34\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.985395 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.985449 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.985472 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.985503 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.985526 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:28Z","lastTransitionTime":"2026-03-18T12:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:28 crc kubenswrapper[4921]: I0318 12:11:28.986799 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.003146 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.014538 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.088685 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.088731 4921 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.088743 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.088760 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.088773 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.112507 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" Mar 18 12:11:29 crc kubenswrapper[4921]: W0318 12:11:29.131412 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2119089d_3cd5_4294_8e34_f5dcb27e0e34.slice/crio-ed65bf7da6e6642b552142bdd5396da17eecee23d64b5e8e7cb328ba4525c655 WatchSource:0}: Error finding container ed65bf7da6e6642b552142bdd5396da17eecee23d64b5e8e7cb328ba4525c655: Status 404 returned error can't find the container with id ed65bf7da6e6642b552142bdd5396da17eecee23d64b5e8e7cb328ba4525c655 Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.135166 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.135203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.135217 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.135238 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.135252 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.136319 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:29 crc kubenswrapper[4921]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:29 crc kubenswrapper[4921]: set -euo pipefail Mar 18 12:11:29 crc kubenswrapper[4921]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 12:11:29 crc kubenswrapper[4921]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 12:11:29 crc kubenswrapper[4921]: # As the secret mount is optional we must wait for the files to be present. Mar 18 12:11:29 crc kubenswrapper[4921]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 18 12:11:29 crc kubenswrapper[4921]: TS=$(date +%s) Mar 18 12:11:29 crc kubenswrapper[4921]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 12:11:29 crc kubenswrapper[4921]: HAS_LOGGED_INFO=0 Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: log_missing_certs(){ Mar 18 12:11:29 crc kubenswrapper[4921]: CUR_TS=$(date +%s) Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 12:11:29 crc kubenswrapper[4921]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 12:11:29 crc kubenswrapper[4921]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 18 12:11:29 crc kubenswrapper[4921]: HAS_LOGGED_INFO=1 Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: } Mar 18 12:11:29 crc kubenswrapper[4921]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 18 12:11:29 crc kubenswrapper[4921]: log_missing_certs Mar 18 12:11:29 crc kubenswrapper[4921]: sleep 5 Mar 18 12:11:29 crc kubenswrapper[4921]: done Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 12:11:29 crc kubenswrapper[4921]: exec /usr/bin/kube-rbac-proxy \ Mar 18 12:11:29 crc kubenswrapper[4921]: --logtostderr \ Mar 18 12:11:29 crc kubenswrapper[4921]: --secure-listen-address=:9108 \ Mar 18 12:11:29 crc kubenswrapper[4921]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 12:11:29 crc kubenswrapper[4921]: --upstream=http://127.0.0.1:29108/ \ Mar 18 12:11:29 crc kubenswrapper[4921]: --tls-private-key-file=${TLS_PK} \ Mar 18 12:11:29 crc kubenswrapper[4921]: --tls-cert-file=${TLS_CERT} Mar 18 12:11:29 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-6t6wg_openshift-ovn-kubernetes(2119089d-3cd5-4294-8e34-f5dcb27e0e34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:29 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.139257 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:29 crc kubenswrapper[4921]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:29 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:29 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_join_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 
12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_join_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_transit_switch_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_transit_switch_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: dns_name_resolver_enabled_flag= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "false" == "true" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: persistent_ips_enabled_flag= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "true" == "true" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: # This is needed so that converting clusters from GA to TP Mar 18 12:11:29 crc kubenswrapper[4921]: # will rollout control plane pods as well Mar 18 12:11:29 crc kubenswrapper[4921]: network_segmentation_enabled_flag= Mar 18 12:11:29 crc kubenswrapper[4921]: multi_network_enabled_flag= Mar 18 12:11:29 crc 
kubenswrapper[4921]: if [[ "true" == "true" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: multi_network_enabled_flag="--enable-multi-network" Mar 18 12:11:29 crc kubenswrapper[4921]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 12:11:29 crc kubenswrapper[4921]: exec /usr/bin/ovnkube \ Mar 18 12:11:29 crc kubenswrapper[4921]: --enable-interconnect \ Mar 18 12:11:29 crc kubenswrapper[4921]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 12:11:29 crc kubenswrapper[4921]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 12:11:29 crc kubenswrapper[4921]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 12:11:29 crc kubenswrapper[4921]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 12:11:29 crc kubenswrapper[4921]: --metrics-enable-pprof \ Mar 18 12:11:29 crc kubenswrapper[4921]: --metrics-enable-config-duration \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v4_join_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v6_join_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${dns_name_resolver_enabled_flag} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${persistent_ips_enabled_flag} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${multi_network_enabled_flag} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${network_segmentation_enabled_flag} Mar 18 12:11:29 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-6t6wg_openshift-ovn-kubernetes(2119089d-3cd5-4294-8e34-f5dcb27e0e34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:29 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.140702 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" podUID="2119089d-3cd5-4294-8e34-f5dcb27e0e34" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.151596 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.155377 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.155413 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.155424 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.155442 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.155455 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.171761 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.175456 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.175487 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.175498 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.175559 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.175571 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.188935 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.192870 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.192909 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.192925 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.192944 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.192957 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.206659 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.210224 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.210390 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.210509 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.210607 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.210698 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.221315 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.221475 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.222845 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.222872 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.222883 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.222898 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.222908 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.325531 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.325579 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.325590 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.325605 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.325613 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.428353 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.428411 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.428430 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.428450 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.428466 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.504691 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g8926"] Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.505258 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.505341 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.516555 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.527580 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.532067 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.532141 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.532155 4921 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.532171 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.532183 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.537451 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.547795 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.557815 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.575133 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.582957 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.599372 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.609024 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.619298 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.633174 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.634833 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.634878 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.634891 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.634910 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.634921 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.656917 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.656980 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrt7\" (UniqueName: \"kubernetes.io/projected/b6113b50-f73b-4839-ae08-2bc7d4abb024-kube-api-access-msrt7\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.657970 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.668015 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.679031 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.686246 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.701758 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" event={"ID":"2119089d-3cd5-4294-8e34-f5dcb27e0e34","Type":"ContainerStarted","Data":"ed65bf7da6e6642b552142bdd5396da17eecee23d64b5e8e7cb328ba4525c655"} Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.704293 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:29 crc kubenswrapper[4921]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:29 crc kubenswrapper[4921]: set -euo pipefail Mar 18 12:11:29 crc kubenswrapper[4921]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 18 12:11:29 crc kubenswrapper[4921]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 18 12:11:29 crc kubenswrapper[4921]: # As the secret mount is optional we must wait for the files to be present. Mar 18 12:11:29 crc kubenswrapper[4921]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 18 12:11:29 crc kubenswrapper[4921]: TS=$(date +%s) Mar 18 12:11:29 crc kubenswrapper[4921]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 18 12:11:29 crc kubenswrapper[4921]: HAS_LOGGED_INFO=0 Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: log_missing_certs(){ Mar 18 12:11:29 crc kubenswrapper[4921]: CUR_TS=$(date +%s) Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 18 12:11:29 crc kubenswrapper[4921]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 18 12:11:29 crc kubenswrapper[4921]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. 
Mar 18 12:11:29 crc kubenswrapper[4921]: HAS_LOGGED_INFO=1 Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: } Mar 18 12:11:29 crc kubenswrapper[4921]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 18 12:11:29 crc kubenswrapper[4921]: log_missing_certs Mar 18 12:11:29 crc kubenswrapper[4921]: sleep 5 Mar 18 12:11:29 crc kubenswrapper[4921]: done Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 18 12:11:29 crc kubenswrapper[4921]: exec /usr/bin/kube-rbac-proxy \ Mar 18 12:11:29 crc kubenswrapper[4921]: --logtostderr \ Mar 18 12:11:29 crc kubenswrapper[4921]: --secure-listen-address=:9108 \ Mar 18 12:11:29 crc kubenswrapper[4921]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 18 12:11:29 crc kubenswrapper[4921]: --upstream=http://127.0.0.1:29108/ \ Mar 18 12:11:29 crc kubenswrapper[4921]: --tls-private-key-file=${TLS_PK} \ Mar 18 12:11:29 crc kubenswrapper[4921]: --tls-cert-file=${TLS_CERT} Mar 18 12:11:29 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-6t6wg_openshift-ovn-kubernetes(2119089d-3cd5-4294-8e34-f5dcb27e0e34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:29 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.707187 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:29 crc kubenswrapper[4921]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:29 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:29 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_join_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 18 
12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_join_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_transit_switch_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_transit_switch_subnet_opt= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "" != "" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: dns_name_resolver_enabled_flag= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "false" == "true" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: persistent_ips_enabled_flag= Mar 18 12:11:29 crc kubenswrapper[4921]: if [[ "true" == "true" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: # This is needed so that converting clusters from GA to TP Mar 18 12:11:29 crc kubenswrapper[4921]: # will rollout control plane pods as well Mar 18 12:11:29 crc kubenswrapper[4921]: network_segmentation_enabled_flag= Mar 18 12:11:29 crc kubenswrapper[4921]: multi_network_enabled_flag= Mar 18 12:11:29 crc 
kubenswrapper[4921]: if [[ "true" == "true" ]]; then Mar 18 12:11:29 crc kubenswrapper[4921]: multi_network_enabled_flag="--enable-multi-network" Mar 18 12:11:29 crc kubenswrapper[4921]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 18 12:11:29 crc kubenswrapper[4921]: fi Mar 18 12:11:29 crc kubenswrapper[4921]: Mar 18 12:11:29 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 18 12:11:29 crc kubenswrapper[4921]: exec /usr/bin/ovnkube \ Mar 18 12:11:29 crc kubenswrapper[4921]: --enable-interconnect \ Mar 18 12:11:29 crc kubenswrapper[4921]: --init-cluster-manager "${K8S_NODE}" \ Mar 18 12:11:29 crc kubenswrapper[4921]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 18 12:11:29 crc kubenswrapper[4921]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 18 12:11:29 crc kubenswrapper[4921]: --metrics-bind-address "127.0.0.1:29108" \ Mar 18 12:11:29 crc kubenswrapper[4921]: --metrics-enable-pprof \ Mar 18 12:11:29 crc kubenswrapper[4921]: --metrics-enable-config-duration \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v4_join_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v6_join_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${dns_name_resolver_enabled_flag} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${persistent_ips_enabled_flag} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${multi_network_enabled_flag} \ Mar 18 12:11:29 crc kubenswrapper[4921]: ${network_segmentation_enabled_flag} Mar 18 12:11:29 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-6t6wg_openshift-ovn-kubernetes(2119089d-3cd5-4294-8e34-f5dcb27e0e34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:29 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.708450 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" podUID="2119089d-3cd5-4294-8e34-f5dcb27e0e34" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.714059 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.720989 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.729981 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.737755 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.737788 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.737799 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.737814 4921 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.737828 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.737927 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.744407 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.752401 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.758257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.758481 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrt7\" (UniqueName: \"kubernetes.io/projected/b6113b50-f73b-4839-ae08-2bc7d4abb024-kube-api-access-msrt7\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 
12:11:29.758488 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:29 crc kubenswrapper[4921]: E0318 12:11:29.758788 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs podName:b6113b50-f73b-4839-ae08-2bc7d4abb024 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:30.258768439 +0000 UTC m=+109.808689158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs") pod "network-metrics-daemon-g8926" (UID: "b6113b50-f73b-4839-ae08-2bc7d4abb024") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.761635 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.773954 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.774565 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrt7\" (UniqueName: \"kubernetes.io/projected/b6113b50-f73b-4839-ae08-2bc7d4abb024-kube-api-access-msrt7\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.788598 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.795300 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.807908 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.822072 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.830613 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.840348 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.840428 4921 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.840442 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.840483 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.840496 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.841422 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.856872 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.944795 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.944857 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.944876 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.944899 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4921]: I0318 12:11:29.944916 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.048282 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.048375 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.048411 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.048439 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.048460 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.151575 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.151884 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.151906 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.151930 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.151947 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.208874 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.208874 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.208900 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.209310 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.209436 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.209598 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.212253 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:30 crc kubenswrapper[4921]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:30 crc kubenswrapper[4921]: set -uo pipefail Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 18 12:11:30 crc kubenswrapper[4921]: HOSTS_FILE="/etc/hosts" Mar 18 12:11:30 crc kubenswrapper[4921]: TEMP_FILE="/etc/hosts.tmp" Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: # Make a temporary file with the old hosts file's attributes. Mar 18 12:11:30 crc kubenswrapper[4921]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 18 12:11:30 crc kubenswrapper[4921]: echo "Failed to preserve hosts file. Exiting." Mar 18 12:11:30 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:30 crc kubenswrapper[4921]: fi Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: while true; do Mar 18 12:11:30 crc kubenswrapper[4921]: declare -A svc_ips Mar 18 12:11:30 crc kubenswrapper[4921]: for svc in "${services[@]}"; do Mar 18 12:11:30 crc kubenswrapper[4921]: # Fetch service IP from cluster dns if present. We make several tries Mar 18 12:11:30 crc kubenswrapper[4921]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 18 12:11:30 crc kubenswrapper[4921]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 18 12:11:30 crc kubenswrapper[4921]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 18 12:11:30 crc kubenswrapper[4921]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:30 crc kubenswrapper[4921]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:30 crc kubenswrapper[4921]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 18 12:11:30 crc kubenswrapper[4921]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 18 12:11:30 crc kubenswrapper[4921]: for i in ${!cmds[*]} Mar 18 12:11:30 crc kubenswrapper[4921]: do Mar 18 12:11:30 crc kubenswrapper[4921]: ips=($(eval "${cmds[i]}")) Mar 18 12:11:30 crc kubenswrapper[4921]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 18 12:11:30 crc kubenswrapper[4921]: svc_ips["${svc}"]="${ips[@]}" Mar 18 12:11:30 crc kubenswrapper[4921]: break Mar 18 12:11:30 crc kubenswrapper[4921]: fi Mar 18 12:11:30 crc kubenswrapper[4921]: done Mar 18 12:11:30 crc kubenswrapper[4921]: done Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: # Update /etc/hosts only if we get valid service IPs Mar 18 12:11:30 crc kubenswrapper[4921]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 18 12:11:30 crc kubenswrapper[4921]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 18 12:11:30 crc kubenswrapper[4921]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 18 12:11:30 crc kubenswrapper[4921]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 18 12:11:30 crc kubenswrapper[4921]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 18 12:11:30 crc kubenswrapper[4921]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 18 12:11:30 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:30 crc kubenswrapper[4921]: continue Mar 18 12:11:30 crc kubenswrapper[4921]: fi Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: # Append resolver entries for services Mar 18 12:11:30 crc kubenswrapper[4921]: rc=0 Mar 18 12:11:30 crc kubenswrapper[4921]: for svc in "${!svc_ips[@]}"; do Mar 18 12:11:30 crc kubenswrapper[4921]: for ip in ${svc_ips[${svc}]}; do Mar 18 12:11:30 crc kubenswrapper[4921]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 18 12:11:30 crc kubenswrapper[4921]: done Mar 18 12:11:30 crc kubenswrapper[4921]: done Mar 18 12:11:30 crc kubenswrapper[4921]: if [[ $rc -ne 0 ]]; then Mar 18 12:11:30 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:30 crc kubenswrapper[4921]: continue Mar 18 12:11:30 crc kubenswrapper[4921]: fi Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: Mar 18 12:11:30 crc kubenswrapper[4921]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 18 12:11:30 crc kubenswrapper[4921]: # Replace /etc/hosts with our modified version if needed Mar 18 12:11:30 crc kubenswrapper[4921]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 18 12:11:30 crc kubenswrapper[4921]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 18 12:11:30 crc kubenswrapper[4921]: fi Mar 18 12:11:30 crc kubenswrapper[4921]: sleep 60 & wait Mar 18 12:11:30 crc kubenswrapper[4921]: unset svc_ips Mar 18 12:11:30 crc kubenswrapper[4921]: done Mar 18 12:11:30 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fx5k7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-2ms8h_openshift-dns(576db9cc-6447-4ab5-b6d4-b9b68e48167e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:30 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.213720 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-2ms8h" 
podUID="576db9cc-6447-4ab5-b6d4-b9b68e48167e" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.255236 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.255298 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.255317 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.255341 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.255358 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.264693 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.264851 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:30 crc kubenswrapper[4921]: E0318 12:11:30.264911 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs podName:b6113b50-f73b-4839-ae08-2bc7d4abb024 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:31.264891363 +0000 UTC m=+110.814812012 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs") pod "network-metrics-daemon-g8926" (UID: "b6113b50-f73b-4839-ae08-2bc7d4abb024") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.358159 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.358206 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.358219 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.358238 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.358250 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.461080 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.461197 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.461220 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.461255 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.461280 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.564682 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.566597 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.566617 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.566637 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.566648 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.669857 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.669920 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.669942 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.669967 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.669984 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.771910 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.771955 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.771966 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.771982 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.771992 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.873842 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.873905 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.873923 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.873946 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.873963 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.976950 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.977008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.977025 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.977051 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4921]: I0318 12:11:30.977068 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.080140 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.080212 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.080226 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.080249 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.080265 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.183347 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.183409 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.183433 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.183463 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.183489 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.208741 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.209383 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.212031 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:31 crc kubenswrapper[4921]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 18 12:11:31 crc kubenswrapper[4921]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 18 12:11:31 crc kubenswrapper[4921]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f5bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-gkdzx_openshift-multus(888e124c-ec0f-4c32-bd78-1ff258933bde): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:31 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.212277 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2dh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-46nj9_openshift-multus(fba8acfc-17d7-4738-9e7b-58d51c0c8085): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.213603 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-46nj9" podUID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.213625 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-gkdzx" podUID="888e124c-ec0f-4c32-bd78-1ff258933bde" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.222172 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.237692 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.250648 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.261996 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.273792 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.276290 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: 
\"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.276442 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:31 crc kubenswrapper[4921]: E0318 12:11:31.276532 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs podName:b6113b50-f73b-4839-ae08-2bc7d4abb024 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.276513693 +0000 UTC m=+112.826434332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs") pod "network-metrics-daemon-g8926" (UID: "b6113b50-f73b-4839-ae08-2bc7d4abb024") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.283058 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.287319 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.287356 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.287367 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.287386 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.287399 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.290308 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.301932 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.311684 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.327391 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.335415 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.344076 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.352342 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.367511 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.376072 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.389847 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.390026 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.390190 4921 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.390330 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.390486 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.494069 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.494146 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.494157 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.494176 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.494187 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.596982 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.597399 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.597483 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.597573 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.597668 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.701122 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.701203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.701217 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.701276 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.701292 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.804695 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.804746 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.804765 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.804787 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.804805 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.908568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.908625 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.908637 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.908660 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4921]: I0318 12:11:31.908677 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.011377 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.011413 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.011421 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.011434 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.011443 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.114665 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.114770 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.114806 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.114842 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.114868 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.209148 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.209159 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.209197 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:32 crc kubenswrapper[4921]: E0318 12:11:32.209364 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:32 crc kubenswrapper[4921]: E0318 12:11:32.209577 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:32 crc kubenswrapper[4921]: E0318 12:11:32.210302 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:32 crc kubenswrapper[4921]: E0318 12:11:32.212732 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:32 crc kubenswrapper[4921]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 18 12:11:32 crc kubenswrapper[4921]: apiVersion: v1 Mar 18 12:11:32 crc kubenswrapper[4921]: clusters: Mar 18 12:11:32 crc kubenswrapper[4921]: - cluster: Mar 18 12:11:32 crc kubenswrapper[4921]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 18 12:11:32 crc kubenswrapper[4921]: server: https://api-int.crc.testing:6443 Mar 18 12:11:32 crc kubenswrapper[4921]: name: default-cluster Mar 18 12:11:32 crc kubenswrapper[4921]: contexts: Mar 18 12:11:32 crc kubenswrapper[4921]: - context: Mar 18 12:11:32 crc kubenswrapper[4921]: cluster: default-cluster Mar 18 12:11:32 crc kubenswrapper[4921]: namespace: default Mar 18 12:11:32 crc kubenswrapper[4921]: user: default-auth Mar 18 12:11:32 crc kubenswrapper[4921]: name: default-context Mar 18 12:11:32 crc kubenswrapper[4921]: current-context: default-context Mar 18 12:11:32 crc kubenswrapper[4921]: kind: Config Mar 18 12:11:32 crc kubenswrapper[4921]: preferences: {} Mar 18 12:11:32 crc kubenswrapper[4921]: users: Mar 18 12:11:32 crc kubenswrapper[4921]: - name: default-auth Mar 18 12:11:32 crc kubenswrapper[4921]: user: Mar 18 12:11:32 crc kubenswrapper[4921]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:32 crc kubenswrapper[4921]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 18 12:11:32 crc kubenswrapper[4921]: EOF Mar 18 12:11:32 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnh75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-l6tb7_openshift-ovn-kubernetes(357e939f-66df-4ef0-b64a-a846abdd1ecf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:32 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:32 crc kubenswrapper[4921]: E0318 12:11:32.214013 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.217014 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.217147 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.217166 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.217184 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.217199 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.320286 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.320360 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.320385 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.320417 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.320440 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.424237 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.424324 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.424363 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.424394 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.424415 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.527071 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.527190 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.527210 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.527234 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.527254 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.630936 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.631006 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.631022 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.631048 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.631067 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.734195 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.734281 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.734297 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.734318 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.734333 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.838066 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.838192 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.838217 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.838242 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.838262 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.941621 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.941681 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.941694 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.941711 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4921]: I0318 12:11:32.941721 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.045269 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.045329 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.045339 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.045352 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.045361 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.148860 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.148944 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.148981 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.149012 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.149033 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.208306 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:33 crc kubenswrapper[4921]: E0318 12:11:33.208538 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:33 crc kubenswrapper[4921]: E0318 12:11:33.211925 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:33 crc kubenswrapper[4921]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:33 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:33 crc kubenswrapper[4921]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:33 crc kubenswrapper[4921]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:33 crc kubenswrapper[4921]: else Mar 18 12:11:33 crc kubenswrapper[4921]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:33 crc kubenswrapper[4921]: exit 1 Mar 18 12:11:33 crc kubenswrapper[4921]: fi Mar 18 12:11:33 crc kubenswrapper[4921]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:33 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:33 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:33 crc kubenswrapper[4921]: E0318 12:11:33.213253 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.252705 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 
12:11:33.252755 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.252769 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.252787 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.252799 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.297889 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:33 crc kubenswrapper[4921]: E0318 12:11:33.298140 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:33 crc kubenswrapper[4921]: E0318 12:11:33.298254 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs podName:b6113b50-f73b-4839-ae08-2bc7d4abb024 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:37.298217625 +0000 UTC m=+116.848138274 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs") pod "network-metrics-daemon-g8926" (UID: "b6113b50-f73b-4839-ae08-2bc7d4abb024") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.355405 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.355462 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.355474 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.355492 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.355505 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.458687 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.458769 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.458789 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.458818 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.458841 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.562486 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.562558 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.562602 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.562626 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.562637 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.665560 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.665604 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.665615 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.665631 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.665642 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.769904 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.769979 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.770000 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.770027 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.770050 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.872196 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.872253 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.872265 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.872284 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.872294 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.974917 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.974979 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.974999 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.975024 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4921]: I0318 12:11:33.975042 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.078615 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.078683 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.078706 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.078734 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.078757 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.181981 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.182049 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.182066 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.182089 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.182111 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.208871 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.208954 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.208888 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:34 crc kubenswrapper[4921]: E0318 12:11:34.209014 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:34 crc kubenswrapper[4921]: E0318 12:11:34.209188 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:34 crc kubenswrapper[4921]: E0318 12:11:34.209237 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.284971 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.285018 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.285035 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.285057 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.285072 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.388238 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.388296 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.388313 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.388335 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.388352 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.491340 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.491416 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.491433 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.491452 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.491465 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.594586 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.594646 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.594664 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.594686 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.594702 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.697604 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.697663 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.697675 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.697693 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.697706 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.801192 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.801229 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.801257 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.801275 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.801286 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.905011 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.905055 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.905067 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.905083 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4921]: I0318 12:11:34.905093 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.008754 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.008814 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.008825 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.008840 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.008850 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.111318 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.111383 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.111399 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.111424 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.111441 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.208972 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:35 crc kubenswrapper[4921]: E0318 12:11:35.209266 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.215388 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.215420 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.215431 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.215444 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.215456 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.318020 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.318062 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.318074 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.318091 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.318102 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.421759 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.421815 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.421835 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.421858 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.421876 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.525177 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.525241 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.525259 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.525288 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.525305 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.628867 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.628938 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.628956 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.628981 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.629004 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.732324 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.732370 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.732385 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.732406 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.732423 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.835803 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.835904 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.835931 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.835969 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.835992 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.939134 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.939176 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.939185 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.939200 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4921]: I0318 12:11:35.939211 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.042424 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.042480 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.042505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.042539 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.042561 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.146554 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.146599 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.146612 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.146655 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.146666 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.208220 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.208425 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.208932 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.209090 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.209220 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.209386 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.211455 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:36 crc kubenswrapper[4921]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 18 12:11:36 crc kubenswrapper[4921]: while [ true ]; Mar 18 12:11:36 crc kubenswrapper[4921]: do Mar 18 12:11:36 crc kubenswrapper[4921]: for f in $(ls /tmp/serviceca); do Mar 18 12:11:36 crc kubenswrapper[4921]: echo $f Mar 18 12:11:36 crc kubenswrapper[4921]: ca_file_path="/tmp/serviceca/${f}" Mar 18 12:11:36 crc kubenswrapper[4921]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 18 12:11:36 crc kubenswrapper[4921]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 18 12:11:36 crc kubenswrapper[4921]: if [ -e "${reg_dir_path}" ]; then Mar 18 12:11:36 crc kubenswrapper[4921]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 18 12:11:36 crc kubenswrapper[4921]: else Mar 18 12:11:36 crc kubenswrapper[4921]: mkdir $reg_dir_path Mar 18 12:11:36 crc kubenswrapper[4921]: cp $ca_file_path $reg_dir_path/ca.crt Mar 18 12:11:36 crc kubenswrapper[4921]: fi Mar 18 12:11:36 crc kubenswrapper[4921]: done Mar 18 12:11:36 crc kubenswrapper[4921]: for d in $(ls /etc/docker/certs.d); do Mar 18 12:11:36 crc kubenswrapper[4921]: echo $d Mar 18 12:11:36 crc kubenswrapper[4921]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 18 12:11:36 crc kubenswrapper[4921]: reg_conf_path="/tmp/serviceca/${dp}" Mar 18 12:11:36 crc kubenswrapper[4921]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 18 12:11:36 crc kubenswrapper[4921]: rm -rf /etc/docker/certs.d/$d Mar 18 12:11:36 crc kubenswrapper[4921]: fi Mar 18 12:11:36 crc kubenswrapper[4921]: done Mar 18 12:11:36 crc kubenswrapper[4921]: sleep 60 & wait ${!} Mar 18 12:11:36 crc kubenswrapper[4921]: done Mar 18 12:11:36 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwvjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-8s5jb_openshift-image-registry(43869354-ed6a-464a-8c55-8b05e3c9dc82): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:36 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.211661 4921 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:36 crc kubenswrapper[4921]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:36 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:36 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:36 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:36 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:36 crc kubenswrapper[4921]: fi Mar 18 12:11:36 crc kubenswrapper[4921]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 12:11:36 crc kubenswrapper[4921]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:36 crc kubenswrapper[4921]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:36 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:36 crc kubenswrapper[4921]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:36 crc kubenswrapper[4921]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:36 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:36 crc kubenswrapper[4921]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:36 crc kubenswrapper[4921]: --webhook-host=127.0.0.1 \ Mar 18 12:11:36 crc kubenswrapper[4921]: --webhook-port=9743 \ Mar 18 12:11:36 crc kubenswrapper[4921]: ${ho_enable} \ Mar 18 12:11:36 crc kubenswrapper[4921]: --enable-interconnect \ Mar 18 12:11:36 crc kubenswrapper[4921]: --disable-approver \ Mar 18 12:11:36 crc kubenswrapper[4921]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:36 
crc kubenswrapper[4921]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:36 crc kubenswrapper[4921]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:36 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:36 crc kubenswrapper[4921]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe
:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:36 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.212880 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-8s5jb" podUID="43869354-ed6a-464a-8c55-8b05e3c9dc82" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.213819 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:36 crc kubenswrapper[4921]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:36 crc kubenswrapper[4921]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:36 crc kubenswrapper[4921]: set -o allexport Mar 18 12:11:36 crc kubenswrapper[4921]: source "/env/_master" Mar 18 12:11:36 crc kubenswrapper[4921]: set +o allexport Mar 18 12:11:36 crc kubenswrapper[4921]: fi Mar 18 12:11:36 crc kubenswrapper[4921]: Mar 18 12:11:36 crc kubenswrapper[4921]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:36 crc kubenswrapper[4921]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:36 crc kubenswrapper[4921]: --disable-webhook \ Mar 18 12:11:36 crc kubenswrapper[4921]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:36 crc kubenswrapper[4921]: --loglevel="${LOGLEVEL}" Mar 18 12:11:36 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:36 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:11:36 crc kubenswrapper[4921]: E0318 12:11:36.215046 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to 
\"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.250084 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.250153 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.250165 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.250185 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.250197 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.352770 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.352820 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.352830 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.352846 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.352857 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.434882 4921 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.456042 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.456100 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.456130 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.456150 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.456164 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.560353 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.560418 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.560431 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.560453 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.560463 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.664481 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.664539 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.664557 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.664585 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.664604 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.767899 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.767962 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.767980 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.768008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.768029 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.872089 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.872171 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.872183 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.872204 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.872219 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.975862 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.975917 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.975938 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.975961 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4921]: I0318 12:11:36.975978 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.078984 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.079057 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.079069 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.079089 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.079099 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.182386 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.182459 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.182479 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.182510 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.182531 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.209022 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:37 crc kubenswrapper[4921]: E0318 12:11:37.209214 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.286568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.287034 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.287053 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.287079 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.287098 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.346993 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:37 crc kubenswrapper[4921]: E0318 12:11:37.347224 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:37 crc kubenswrapper[4921]: E0318 12:11:37.347333 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs podName:b6113b50-f73b-4839-ae08-2bc7d4abb024 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:45.34730059 +0000 UTC m=+124.897221289 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs") pod "network-metrics-daemon-g8926" (UID: "b6113b50-f73b-4839-ae08-2bc7d4abb024") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.390516 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.390613 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.390633 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.390658 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.390676 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.494194 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.494247 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.494262 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.494287 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.494310 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.597972 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.598036 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.598060 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.598090 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.598141 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.701489 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.701546 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.701568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.701598 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.701618 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.804727 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.804798 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.804822 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.804848 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.804864 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.908631 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.908683 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.908694 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.908711 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4921]: I0318 12:11:37.908725 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.011735 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.011791 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.011808 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.011831 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.011851 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.114686 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.114724 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.114735 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.114752 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.114762 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.208950 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.209007 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.208960 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:38 crc kubenswrapper[4921]: E0318 12:11:38.209061 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:38 crc kubenswrapper[4921]: E0318 12:11:38.209147 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:38 crc kubenswrapper[4921]: E0318 12:11:38.209248 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.217518 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.217565 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.217582 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.217606 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.217624 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.320264 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.320342 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.320364 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.320393 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.320419 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.422922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.422963 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.422979 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.423000 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.423013 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.526538 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.526639 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.526657 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.526681 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.526697 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.629558 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.629684 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.629710 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.629739 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.629763 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.731649 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.731748 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.731769 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.731794 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.731810 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.834714 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.834758 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.834767 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.834782 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.834794 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.937616 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.937703 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.937737 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.937754 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4921]: I0318 12:11:38.937766 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.040410 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.040458 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.040480 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.040505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.040518 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.143100 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.143179 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.143191 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.143230 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.143243 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.208336 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.208695 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.246358 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.246415 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.246432 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.246456 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.246477 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.348736 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.348780 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.348791 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.348807 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.348817 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.451604 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.451649 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.451665 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.451682 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.451695 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.469180 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.469212 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.469220 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.469232 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.469240 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.479227 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.482939 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.482984 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.482996 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.483013 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.483024 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.497325 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.501606 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.501668 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.501691 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.501723 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.501744 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.517239 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.521517 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.521554 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.521564 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.521580 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.521590 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.535773 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.539370 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.539409 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.539421 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.539436 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.539446 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.550422 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5a791cc0-dddf-47b6-8995-f4a6b294e6ba\\\",\\\"systemUUID\\\":\\\"0e096851-48c9-4cbf-9c0d-a42cb1e79e38\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: E0318 12:11:39.550682 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.554269 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.554321 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.554335 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.554353 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.554365 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.578258 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.588783 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.598932 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.608751 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.618822 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.628702 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.640232 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d1
7082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.649389 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.661587 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.661645 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.661658 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.661707 4921 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.661720 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.674703 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.690838 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.701874 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.713475 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.732306 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.735348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.735401 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.742100 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.753192 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.762481 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.765542 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.765594 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.765606 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.765623 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.765635 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.770470 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.779951 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.787168 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.796497 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.807812 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.819183 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.833781 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.843018 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.852172 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.860875 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.868670 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.868705 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.868716 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.868752 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.868766 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.871159 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.880323 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.889595 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d1
7082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.896409 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.906154 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.970993 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.971203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.971233 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.971258 4921 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4921]: I0318 12:11:39.971275 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.074934 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.074983 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.074994 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.075010 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.075025 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.178025 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.178064 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.178075 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.178091 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.178102 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.208847 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.208951 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.209163 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:40 crc kubenswrapper[4921]: E0318 12:11:40.209341 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:40 crc kubenswrapper[4921]: E0318 12:11:40.209869 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:40 crc kubenswrapper[4921]: E0318 12:11:40.209976 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.281554 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.281592 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.281601 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.281647 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.281659 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.384269 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.384308 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.384316 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.384330 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.384339 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.487103 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.487172 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.487188 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.487205 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.487216 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.590828 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.590909 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.590931 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.590961 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.590983 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.694660 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.694728 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.694751 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.694785 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.694808 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.740764 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.758339 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.773533 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.786429 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.797203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.797244 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.797259 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.797279 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.797291 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.802881 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.815457 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.832307 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.844395 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.863420 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.875362 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.889782 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.901195 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.901258 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.901278 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.901304 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.901324 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.902966 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.920314 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.938583 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.957852 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:40 crc kubenswrapper[4921]: I0318 12:11:40.978170 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.005263 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.005297 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.005310 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.005327 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.005339 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.108586 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.108627 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.108640 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.108656 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.108666 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4921]: E0318 12:11:41.208844 4921 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.209161 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:41 crc kubenswrapper[4921]: E0318 12:11:41.209388 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.222374 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 
12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.231818 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.246237 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.254879 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.267007 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.276041 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.287338 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.297465 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.307812 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: E0318 12:11:41.317094 4921 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.333011 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.341884 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.352637 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.360346 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.369033 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.377464 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.746277 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" event={"ID":"2119089d-3cd5-4294-8e34-f5dcb27e0e34","Type":"ContainerStarted","Data":"08297f8ca878733c76bd12121985918d2b9547dec3cd240ab00907106f74bd11"} Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.746354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" event={"ID":"2119089d-3cd5-4294-8e34-f5dcb27e0e34","Type":"ContainerStarted","Data":"146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac"} Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.769536 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.785150 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.799017 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.812966 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.823623 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.837633 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d1
7082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.847231 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.861694 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.877255 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.899315 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.912487 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.928645 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08297f8ca878733c76bd12121985918d2b954
7dec3cd240ab00907106f74bd11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.943803 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.958167 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:41 crc kubenswrapper[4921]: I0318 12:11:41.969282 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.204876 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.205025 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.205073 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.205133 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205172 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:14.205104762 +0000 UTC m=+153.755025441 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.205224 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205293 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205524 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205540 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205293 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 
12:11:42.205599 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205611 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205660 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:14.205642887 +0000 UTC m=+153.755563536 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205390 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205396 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205812 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:14.205787831 +0000 UTC m=+153.755708540 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.205875 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:14.205849232 +0000 UTC m=+153.755769961 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.206102 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:14.206090189 +0000 UTC m=+153.756010838 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.209023 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.209245 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.209306 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.209443 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:42 crc kubenswrapper[4921]: I0318 12:11:42.209306 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:42 crc kubenswrapper[4921]: E0318 12:11:42.209509 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:43 crc kubenswrapper[4921]: I0318 12:11:43.208481 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:43 crc kubenswrapper[4921]: E0318 12:11:43.208676 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.208448 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.208464 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.208495 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:44 crc kubenswrapper[4921]: E0318 12:11:44.208945 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:44 crc kubenswrapper[4921]: E0318 12:11:44.209764 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:44 crc kubenswrapper[4921]: E0318 12:11:44.209682 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.760010 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d580e16ad0c562bb06096fbab648aee5e3fe7b01643d49bbfa40f73ffeb7722e"} Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.761659 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gkdzx" event={"ID":"888e124c-ec0f-4c32-bd78-1ff258933bde","Type":"ContainerStarted","Data":"1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878"} Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.772878 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.782947 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.801491 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.810981 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.822984 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.831067 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.840519 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.851513 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.863252 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.882589 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.891467 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.900888 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08297f8ca878733c76bd12121985918d2b954
7dec3cd240ab00907106f74bd11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.911853 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d580e16ad0c562bb06096fbab648aee5e3fe7b01643d49bbfa40f73ffeb7722e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.921798 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.932826 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.950618 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.961425 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.980850 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:44 crc kubenswrapper[4921]: I0318 12:11:44.990065 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.011024 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.021365 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08297f8ca878733c76bd12121985918d2b954
7dec3cd240ab00907106f74bd11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.029707 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.041512 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.051712 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.064854 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.084945 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.093516 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.103109 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d580e16ad0c562bb06096fbab648aee5e3fe7b01643d49bbfa40f73ffeb7722e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.112499 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.123958 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.208998 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:45 crc kubenswrapper[4921]: E0318 12:11:45.209244 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.441505 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:45 crc kubenswrapper[4921]: E0318 12:11:45.441693 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:45 crc kubenswrapper[4921]: E0318 12:11:45.442148 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs podName:b6113b50-f73b-4839-ae08-2bc7d4abb024 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:01.442128087 +0000 UTC m=+140.992048726 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs") pod "network-metrics-daemon-g8926" (UID: "b6113b50-f73b-4839-ae08-2bc7d4abb024") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.766357 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ms8h" event={"ID":"576db9cc-6447-4ab5-b6d4-b9b68e48167e","Type":"ContainerStarted","Data":"eb425a6457bd58c147702e4a7dfc9e77b10cfe3cf031b64b093e1bbfcbcb511d"} Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.768146 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" exitCode=0 Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.768194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.778053 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.786773 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.800372 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.810231 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb425a6457bd58c147702e4a7dfc9e77b10cfe3cf031b64b093e1bbfcbcb511d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.823482 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.831574 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.839410 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08297f8ca878733c76bd12121985918d2b954
7dec3cd240ab00907106f74bd11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.852374 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.862055 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.872893 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.888608 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.906159 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.914910 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d580e16ad0c562bb06096fbab648aee5e3fe7b01643d49bbfa40f73ffeb7722e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.931902 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.939953 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.956205 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d580e16ad0c562bb06096fbab648aee5e3fe7b01643d49bbfa40f73ffeb7722e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.975301 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4921]: I0318 12:11:45.990867 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.009544 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.017440 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.030936 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.040434 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb425a6457bd58c147702e4a7dfc9e77b10cfe3cf031b64b093e1bbfcbcb511d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.052968 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.071660 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.080967 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.092232 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08297f8ca878733c76bd12121985918d2b954
7dec3cd240ab00907106f74bd11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.108639 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.120211 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.132228 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.144631 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.208560 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:46 crc kubenswrapper[4921]: E0318 12:11:46.208699 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.208911 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:46 crc kubenswrapper[4921]: E0318 12:11:46.208998 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.209183 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:46 crc kubenswrapper[4921]: E0318 12:11:46.209310 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:46 crc kubenswrapper[4921]: E0318 12:11:46.317902 4921 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.773458 4921 generic.go:334] "Generic (PLEG): container finished" podID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" containerID="73b7b389796941d7e48d8a3e7f090c4830dddc20752a184ac664981435ea3af0" exitCode=0 Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.773531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerDied","Data":"73b7b389796941d7e48d8a3e7f090c4830dddc20752a184ac664981435ea3af0"} Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.779509 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.779562 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.779575 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} Mar 18 12:11:46 crc 
kubenswrapper[4921]: I0318 12:11:46.779584 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.779592 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.779600 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.794006 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3975745-b934-4ee7-9835-15eaeb9e2931\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:38Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:10:38.579892 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:10:38.580022 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:10:38.580694 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-524821673/tls.crt::/tmp/serving-cert-524821673/tls.key\\\\\\\"\\\\nI0318 12:10:38.825921 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:10:38.831205 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:10:38.831240 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:10:38.831265 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:10:38.831271 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:10:38.838249 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:10:38.838284 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838290 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:10:38.838295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:10:38.838298 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:10:38.838324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:10:38.838327 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:10:38.838419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:10:38.841543 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:09:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f817ff827c299964491a7b5e5be7a57d1
7082728aa7d25367285cf36a8c84451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:09:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.805435 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2ms8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"576db9cc-6447-4ab5-b6d4-b9b68e48167e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb425a6457bd58c147702e4a7dfc9e77b10cfe3cf031b64b093e1bbfcbcb511d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx5k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2ms8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.818762 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b7b389796941d7e48d8a3e7f090c4830dddc20752a184ac664981435ea3af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b7b389796941d7e48d8a3e7f090c4830dddc20752a184ac664981435ea3af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.828508 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.838638 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.849691 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gkdzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888e124c-ec0f-4c32-bd78-1ff258933bde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7f5bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gkdzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.866159 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357e939f-66df-4ef0-b64a-a846abdd1ecf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnh75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l6tb7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.874516 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8s5jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43869354-ed6a-464a-8c55-8b05e3c9dc82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwvjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8s5jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.883584 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2119089d-3cd5-4294-8e34-f5dcb27e0e34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://146b76e981ecf8cd9d7304b7caf9760c21818657d7f380f7fd305754e754f0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08297f8ca878733c76bd12121985918d2b954
7dec3cd240ab00907106f74bd11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqwnz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6t6wg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.890255 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g8926" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6113b50-f73b-4839-ae08-2bc7d4abb024\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-msrt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g8926\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.899349 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d580e16ad0c562bb06096fbab648aee5e3fe7b01643d49bbfa40f73ffeb7722e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.910568 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.921197 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"509553d8-b894-456c-a45e-665e8497cdbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://152ae88a671bf7a4657d22f9b01994608e42aa3d6fe6028dc1bf2e5a588e8365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab8805d660292aca69d1573e05604dab7bfe650
9e503cc63788830232c78aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xg55l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fsfj7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.929708 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:46 crc kubenswrapper[4921]: I0318 12:11:46.938465 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f659ea8f2ab91ef06a5de4ed72c3c546072e9afce1ded73c2e027a60183a513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 
12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.209367 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:47 crc kubenswrapper[4921]: E0318 12:11:47.209746 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.224419 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.787491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d6f43388a61681669f27815642b6b59701d784e9caf15f8d001e67a90a3451b8"} Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.787885 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fac5f20e562a9d618f46be5decb8d27d421f8dab1f69f13a5f009f32b935c354"} Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.789420 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8s5jb" event={"ID":"43869354-ed6a-464a-8c55-8b05e3c9dc82","Type":"ContainerStarted","Data":"2743549e247eb7183f4555fa4fed25e25a52db31c3bfbb8b8c90bee1368f0e26"} Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.791448 4921 generic.go:334] "Generic (PLEG): container finished" podID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" 
containerID="ce2eb39f09fb6f244f746b8347dc0ee21f70492b58abe504369fdc2f377d11a8" exitCode=0 Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.791523 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerDied","Data":"ce2eb39f09fb6f244f746b8347dc0ee21f70492b58abe504369fdc2f377d11a8"} Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.820279 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fba8acfc-17d7-4738-9e7b-58d51c0c8085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b7b389796941d7e48d8a3e7f090c4830dddc20752a184ac664981435ea3af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b7b389796941d7e48d8a3e7f090c4830dddc20752a184ac664981435ea3af0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2dh7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:16Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-46nj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.868709 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2ms8h" podStartSLOduration=70.868690437 podStartE2EDuration="1m10.868690437s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:47.86844817 +0000 UTC m=+127.418368809" watchObservedRunningTime="2026-03-18 12:11:47.868690437 +0000 UTC m=+127.418611076" Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.868895 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=37.868889482 podStartE2EDuration="37.868889482s" podCreationTimestamp="2026-03-18 12:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
12:11:47.858411692 +0000 UTC m=+127.408332341" watchObservedRunningTime="2026-03-18 12:11:47.868889482 +0000 UTC m=+127.418810121" Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.899197 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gkdzx" podStartSLOduration=69.899174881 podStartE2EDuration="1m9.899174881s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:47.898881964 +0000 UTC m=+127.448802603" watchObservedRunningTime="2026-03-18 12:11:47.899174881 +0000 UTC m=+127.449095530" Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.954105 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6t6wg" podStartSLOduration=69.954090759 podStartE2EDuration="1m9.954090759s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:47.944280957 +0000 UTC m=+127.494201596" watchObservedRunningTime="2026-03-18 12:11:47.954090759 +0000 UTC m=+127.504011398" Mar 18 12:11:47 crc kubenswrapper[4921]: I0318 12:11:47.964934 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.964914748 podStartE2EDuration="964.914748ms" podCreationTimestamp="2026-03-18 12:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:47.964402444 +0000 UTC m=+127.514323083" watchObservedRunningTime="2026-03-18 12:11:47.964914748 +0000 UTC m=+127.514835387" Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.004674 4921 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podStartSLOduration=71.00465587 podStartE2EDuration="1m11.00465587s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:48.004547177 +0000 UTC m=+127.554467816" watchObservedRunningTime="2026-03-18 12:11:48.00465587 +0000 UTC m=+127.554576519" Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.077487 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8s5jb" podStartSLOduration=71.077467956 podStartE2EDuration="1m11.077467956s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:48.076462389 +0000 UTC m=+127.626383028" watchObservedRunningTime="2026-03-18 12:11:48.077467956 +0000 UTC m=+127.627388595" Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.208752 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.208831 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:48 crc kubenswrapper[4921]: E0318 12:11:48.208909 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.208752 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:48 crc kubenswrapper[4921]: E0318 12:11:48.209174 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:48 crc kubenswrapper[4921]: E0318 12:11:48.209391 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.801107 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.804095 4921 generic.go:334] "Generic (PLEG): container finished" podID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" containerID="ad28a07f1cbe50f8c7e4411a10c6162b9ba9ac71052aa7fccc5227cbb63e8870" exitCode=0 Mar 18 12:11:48 crc kubenswrapper[4921]: I0318 12:11:48.804192 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerDied","Data":"ad28a07f1cbe50f8c7e4411a10c6162b9ba9ac71052aa7fccc5227cbb63e8870"} Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.208431 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:49 crc kubenswrapper[4921]: E0318 12:11:49.208649 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.660927 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.660972 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.660981 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.660994 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.661003 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.722232 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv"] Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.722645 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.724608 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.725080 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.726263 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.726547 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.809625 4921 generic.go:334] "Generic (PLEG): container finished" podID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" containerID="9e0dafcbe4e503cf9183243794a9fc59e6267d76baf4c29edf86a8ccbd788b44" exitCode=0 Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.809696 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerDied","Data":"9e0dafcbe4e503cf9183243794a9fc59e6267d76baf4c29edf86a8ccbd788b44"} Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.887283 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.887343 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.887390 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.887444 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.887475 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.988560 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.988961 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.988982 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.989018 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.989046 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.989155 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.989304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.990221 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:49 crc kubenswrapper[4921]: I0318 12:11:49.998764 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.006645 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0e970aa-7a9c-4019-a5b0-7b30aa57515b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dffxv\" (UID: \"b0e970aa-7a9c-4019-a5b0-7b30aa57515b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.049431 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" Mar 18 12:11:50 crc kubenswrapper[4921]: W0318 12:11:50.075725 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e970aa_7a9c_4019_a5b0_7b30aa57515b.slice/crio-7c35954310ad5bdf76336aefe3bd8d0eaffe67c53c44d646c32f7dd9a2b1371b WatchSource:0}: Error finding container 7c35954310ad5bdf76336aefe3bd8d0eaffe67c53c44d646c32f7dd9a2b1371b: Status 404 returned error can't find the container with id 7c35954310ad5bdf76336aefe3bd8d0eaffe67c53c44d646c32f7dd9a2b1371b Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.208066 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:50 crc kubenswrapper[4921]: E0318 12:11:50.208245 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.208289 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.208289 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:50 crc kubenswrapper[4921]: E0318 12:11:50.208413 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:50 crc kubenswrapper[4921]: E0318 12:11:50.208449 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.216896 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.226245 4921 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.818194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerStarted","Data":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.818391 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.818424 4921 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.821493 4921 generic.go:334] "Generic (PLEG): container finished" podID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" containerID="cc4f95f978c227de1f04abc364a54a5f3125d1102a86a4325eb7d890cefb4914" exitCode=0 Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.821558 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerDied","Data":"cc4f95f978c227de1f04abc364a54a5f3125d1102a86a4325eb7d890cefb4914"} Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.823153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" event={"ID":"b0e970aa-7a9c-4019-a5b0-7b30aa57515b","Type":"ContainerStarted","Data":"622bda24048db23af2fe1f2eb4a8b7be5a53a5965c2c5e6bb175438a5580c24c"} Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.823183 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" event={"ID":"b0e970aa-7a9c-4019-a5b0-7b30aa57515b","Type":"ContainerStarted","Data":"7c35954310ad5bdf76336aefe3bd8d0eaffe67c53c44d646c32f7dd9a2b1371b"} Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.850233 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.864977 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podStartSLOduration=72.864938268 podStartE2EDuration="1m12.864938268s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
12:11:50.845637892 +0000 UTC m=+130.395558541" watchObservedRunningTime="2026-03-18 12:11:50.864938268 +0000 UTC m=+130.414858907" Mar 18 12:11:50 crc kubenswrapper[4921]: I0318 12:11:50.876581 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dffxv" podStartSLOduration=73.876566249 podStartE2EDuration="1m13.876566249s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:50.875727766 +0000 UTC m=+130.425648405" watchObservedRunningTime="2026-03-18 12:11:50.876566249 +0000 UTC m=+130.426486888" Mar 18 12:11:51 crc kubenswrapper[4921]: I0318 12:11:51.208699 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:51 crc kubenswrapper[4921]: E0318 12:11:51.209370 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:51 crc kubenswrapper[4921]: E0318 12:11:51.319517 4921 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:11:51 crc kubenswrapper[4921]: I0318 12:11:51.831074 4921 generic.go:334] "Generic (PLEG): container finished" podID="fba8acfc-17d7-4738-9e7b-58d51c0c8085" containerID="f21709a89c62469757ccd04645342d413f69d772a05a0f73cd5508fdcc236f41" exitCode=0 Mar 18 12:11:51 crc kubenswrapper[4921]: I0318 12:11:51.831173 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerDied","Data":"f21709a89c62469757ccd04645342d413f69d772a05a0f73cd5508fdcc236f41"} Mar 18 12:11:51 crc kubenswrapper[4921]: I0318 12:11:51.831857 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:51 crc kubenswrapper[4921]: I0318 12:11:51.864918 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.208412 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.208446 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.208462 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:52 crc kubenswrapper[4921]: E0318 12:11:52.208518 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:52 crc kubenswrapper[4921]: E0318 12:11:52.208685 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:52 crc kubenswrapper[4921]: E0318 12:11:52.209140 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.840193 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-46nj9" event={"ID":"fba8acfc-17d7-4738-9e7b-58d51c0c8085","Type":"ContainerStarted","Data":"e1df6595c28602fabb634d05e914ac3974ba5768982fbb8b45673d8b9d5d3e6e"} Mar 18 12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.862011 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-46nj9" podStartSLOduration=74.861992752 podStartE2EDuration="1m14.861992752s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:11:52.861771186 +0000 UTC m=+132.411691835" watchObservedRunningTime="2026-03-18 12:11:52.861992752 +0000 UTC m=+132.411913391" Mar 18 
12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.944676 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g8926"] Mar 18 12:11:52 crc kubenswrapper[4921]: I0318 12:11:52.944782 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:52 crc kubenswrapper[4921]: E0318 12:11:52.944884 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:54 crc kubenswrapper[4921]: I0318 12:11:54.208086 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:54 crc kubenswrapper[4921]: E0318 12:11:54.208505 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:54 crc kubenswrapper[4921]: I0318 12:11:54.208254 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:54 crc kubenswrapper[4921]: I0318 12:11:54.208164 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:54 crc kubenswrapper[4921]: E0318 12:11:54.208585 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:54 crc kubenswrapper[4921]: I0318 12:11:54.208254 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:54 crc kubenswrapper[4921]: E0318 12:11:54.208731 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:54 crc kubenswrapper[4921]: E0318 12:11:54.208839 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:56 crc kubenswrapper[4921]: I0318 12:11:56.208614 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:56 crc kubenswrapper[4921]: I0318 12:11:56.208617 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:56 crc kubenswrapper[4921]: E0318 12:11:56.209439 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:56 crc kubenswrapper[4921]: I0318 12:11:56.208685 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:56 crc kubenswrapper[4921]: I0318 12:11:56.208678 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:56 crc kubenswrapper[4921]: E0318 12:11:56.209591 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g8926" podUID="b6113b50-f73b-4839-ae08-2bc7d4abb024" Mar 18 12:11:56 crc kubenswrapper[4921]: E0318 12:11:56.209722 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:56 crc kubenswrapper[4921]: E0318 12:11:56.209905 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.208524 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.208699 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.208744 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.208800 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.210918 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.211025 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.212723 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.213235 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.215150 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 12:11:58 crc kubenswrapper[4921]: I0318 12:11:58.217766 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:11:59 crc kubenswrapper[4921]: I0318 12:11:59.966312 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.017267 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pwwdq"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.017911 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.020671 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.021336 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.021799 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zt5mh"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.022635 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.022836 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.023257 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.023423 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.024127 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.025739 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.026296 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.030547 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.031021 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.052600 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.052683 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.053974 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.054659 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.061297 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.065062 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.066083 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.066625 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.068645 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.068912 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.072031 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.072284 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.077290 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.077409 4921 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.077559 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.078320 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.079707 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.080079 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.080663 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.080896 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.091711 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.092209 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.092379 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093017 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093154 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093402 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093510 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093693 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093830 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093843 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.093985 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.094007 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.094098 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.094381 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.094509 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.094655 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.094776 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095088 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095103 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095303 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095428 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095567 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095676 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095813 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.095936 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.096150 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.096272 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.096383 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.096593 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.096819 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.097018 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.097580 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8s2wr"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.097973 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9qfns"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.098208 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-swhkn"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.098512 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-swhkn"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.098912 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.099154 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9qfns"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.099233 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101012 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101049 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101207 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101228 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.099388 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101375 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101758 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101922 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-client-ca\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-etcd-serving-ca\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101996 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3888977f-a875-4d0b-85c1-fb3033156ef7-serving-cert\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102014 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-image-import-ca\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102033 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxk4\" (UniqueName: \"kubernetes.io/projected/b98da0c2-cddf-4701-8703-6821fe2bb520-kube-api-access-vrxk4\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102045 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102052 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102083 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f012fd-823f-400e-9af8-9a2264f18589-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102100 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-etcd-client\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102138 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.099763 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.101432 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102511 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102349 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102714 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43a2af0e-c364-40a7-b654-966a74211add-audit-dir\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.102742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86f3c6a-1db9-44c8-911c-46647c933bd7-config\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103408 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-encryption-config\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103432 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/43a2af0e-c364-40a7-b654-966a74211add-node-pullsecrets\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103448 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-audit\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103468 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmx65\" (UniqueName: \"kubernetes.io/projected/b86f3c6a-1db9-44c8-911c-46647c933bd7-kube-api-access-lmx65\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103487 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbcw\" (UniqueName: \"kubernetes.io/projected/c8f1ed20-40fc-4010-be16-03815fce6b82-kube-api-access-2jbcw\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103512 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3888977f-a875-4d0b-85c1-fb3033156ef7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103532 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f012fd-823f-400e-9af8-9a2264f18589-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103551 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-audit-policies\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103568 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-audit-dir\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103591 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-serving-cert\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-serving-cert\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103644 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-config\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103662 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-encryption-config\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-config\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103709 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f1ed20-40fc-4010-be16-03815fce6b82-serving-cert\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103737 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tx7\" (UniqueName: \"kubernetes.io/projected/3888977f-a875-4d0b-85c1-fb3033156ef7-kube-api-access-c2tx7\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103764 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-config\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103792 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznmw\" (UniqueName: \"kubernetes.io/projected/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-kube-api-access-zznmw\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103821 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-serving-cert\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.103890 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b86f3c6a-1db9-44c8-911c-46647c933bd7-images\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104002 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-client-ca\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv48s\" (UniqueName: \"kubernetes.io/projected/d4f012fd-823f-400e-9af8-9a2264f18589-kube-api-access-jv48s\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104084 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgcgc\" (UniqueName: \"kubernetes.io/projected/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-kube-api-access-mgcgc\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104123 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b86f3c6a-1db9-44c8-911c-46647c933bd7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104164 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qc7w\" (UniqueName: \"kubernetes.io/projected/650c2009-2949-4ba8-ad59-8188c8a9523b-kube-api-access-7qc7w\") pod \"cluster-samples-operator-665b6dd947-bkknx\" (UID: \"650c2009-2949-4ba8-ad59-8188c8a9523b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104189 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98da0c2-cddf-4701-8703-6821fe2bb520-serving-cert\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104246 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pqz\" (UniqueName: \"kubernetes.io/projected/43a2af0e-c364-40a7-b654-966a74211add-kube-api-access-w4pqz\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104268 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-service-ca-bundle\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104298 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/650c2009-2949-4ba8-ad59-8188c8a9523b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bkknx\" (UID: \"650c2009-2949-4ba8-ad59-8188c8a9523b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104334 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-etcd-client\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104375 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-config\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.104441 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.106701 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.106777 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.106828 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xl9c9"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.107539 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.110889 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vxtr2"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.111499 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.111967 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.112255 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.112352 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.112931 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.113455 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.114295 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz82p"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.114631 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.115287 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pwwdq"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.117281 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-52p2z"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.117940 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.118340 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.128843 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.130588 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-52p2z"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.135718 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.136630 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.136742 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.136948 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137045 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137151 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137307 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137509 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137570 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137874 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.137921 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.138217 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.138669 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.138671 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.139818 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.138920 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.140673 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.140851 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.142625 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x4bzs"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.147923 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.149391 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.149741 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.149914 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.150076 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.150597 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.150810 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.151057 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.151358 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.151476 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.152410 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.152442 4921 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.153142 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.153414 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.153756 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.154091 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.163564 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.163767 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.163944 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.164103 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.164264 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 
12:12:00.164422 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.164431 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.164546 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.164704 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.167355 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.167530 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.167792 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.167926 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.169572 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.169763 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.171706 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"] Mar 18 
12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.172474 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.172499 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.172633 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.172781 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.172873 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.177037 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.178570 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4lm"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.179417 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.179522 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.179957 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.180206 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.180548 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.180700 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.181153 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4qdcl"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.181864 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.183536 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.184436 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qcvdl"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.184944 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.185166 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.185223 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.185994 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.186721 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.187132 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.191078 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.191743 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.192288 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.193281 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.193443 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.195170 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.196185 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.197781 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.197852 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.200993 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.201208 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.204248 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-serving-cert\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205443 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205468 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b86f3c6a-1db9-44c8-911c-46647c933bd7-images\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205502 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-serving-cert\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205585 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-client-ca\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205613 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv48s\" (UniqueName: \"kubernetes.io/projected/d4f012fd-823f-400e-9af8-9a2264f18589-kube-api-access-jv48s\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205651 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5658a1-dd31-41cc-a12b-653abf972154-serving-cert\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205676 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-config\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205701 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qc7w\" (UniqueName: \"kubernetes.io/projected/650c2009-2949-4ba8-ad59-8188c8a9523b-kube-api-access-7qc7w\") pod \"cluster-samples-operator-665b6dd947-bkknx\" (UID: \"650c2009-2949-4ba8-ad59-8188c8a9523b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205727 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgcgc\" (UniqueName: \"kubernetes.io/projected/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-kube-api-access-mgcgc\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205747 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b86f3c6a-1db9-44c8-911c-46647c933bd7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205786 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98da0c2-cddf-4701-8703-6821fe2bb520-serving-cert\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205809 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-ca\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205828 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pqz\" (UniqueName: 
\"kubernetes.io/projected/43a2af0e-c364-40a7-b654-966a74211add-kube-api-access-w4pqz\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205849 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-oauth-serving-cert\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205873 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-service-ca-bundle\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205915 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/650c2009-2949-4ba8-ad59-8188c8a9523b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bkknx\" (UID: \"650c2009-2949-4ba8-ad59-8188c8a9523b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" Mar 18 12:12:00 crc 
kubenswrapper[4921]: I0318 12:12:00.205937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-etcd-client\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.205958 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-config\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206173 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-client-ca\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206241 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-etcd-serving-ca\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " 
pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-service-ca\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206284 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3888977f-a875-4d0b-85c1-fb3033156ef7-serving-cert\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-image-import-ca\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206346 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxk4\" (UniqueName: \"kubernetes.io/projected/b98da0c2-cddf-4701-8703-6821fe2bb520-kube-api-access-vrxk4\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206387 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206412 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5k6g\" (UniqueName: \"kubernetes.io/projected/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-kube-api-access-h5k6g\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206445 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f012fd-823f-400e-9af8-9a2264f18589-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206470 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-etcd-client\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206488 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206521 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-oauth-config\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgl7t\" (UniqueName: \"kubernetes.io/projected/8c5658a1-dd31-41cc-a12b-653abf972154-kube-api-access-wgl7t\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206563 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43a2af0e-c364-40a7-b654-966a74211add-audit-dir\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206590 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86f3c6a-1db9-44c8-911c-46647c933bd7-config\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206762 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-encryption-config\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/43a2af0e-c364-40a7-b654-966a74211add-node-pullsecrets\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206840 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-audit\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206862 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmx65\" (UniqueName: \"kubernetes.io/projected/b86f3c6a-1db9-44c8-911c-46647c933bd7-kube-api-access-lmx65\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206919 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3888977f-a875-4d0b-85c1-fb3033156ef7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206941 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f012fd-823f-400e-9af8-9a2264f18589-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 
12:12:00.206963 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbcw\" (UniqueName: \"kubernetes.io/projected/c8f1ed20-40fc-4010-be16-03815fce6b82-kube-api-access-2jbcw\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.206981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-service-ca\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-audit-policies\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207041 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-audit-dir\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207070 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-serving-cert\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207120 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-serving-cert\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207143 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-trusted-ca-bundle\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207164 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-config\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207182 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-config\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207203 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-config\") pod \"apiserver-76f77b778f-zt5mh\" (UID: 
\"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207349 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-encryption-config\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207377 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-client\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207400 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f1ed20-40fc-4010-be16-03815fce6b82-serving-cert\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207440 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tx7\" (UniqueName: \"kubernetes.io/projected/3888977f-a875-4d0b-85c1-fb3033156ef7-kube-api-access-c2tx7\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207463 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-config\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207485 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznmw\" (UniqueName: \"kubernetes.io/projected/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-kube-api-access-zznmw\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.207905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-etcd-serving-ca\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.209218 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-client-ca\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.210712 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3888977f-a875-4d0b-85c1-fb3033156ef7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.212020 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-image-import-ca\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.213905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-audit-dir\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.215523 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.215617 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b86f3c6a-1db9-44c8-911c-46647c933bd7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.215801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3888977f-a875-4d0b-85c1-fb3033156ef7-serving-cert\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 
12:12:00.216063 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-config\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.216370 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-serving-cert\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.216562 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-audit-policies\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.216709 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-service-ca-bundle\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.216917 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc 
kubenswrapper[4921]: I0318 12:12:00.217005 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.219474 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dkdbz"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.220575 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f012fd-823f-400e-9af8-9a2264f18589-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.220684 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98da0c2-cddf-4701-8703-6821fe2bb520-serving-cert\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.225225 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5pf85"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.225320 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.227983 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8s2wr"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.228187 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.229028 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-swhkn"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.229879 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/43a2af0e-c364-40a7-b654-966a74211add-audit-dir\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.230972 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.231218 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/43a2af0e-c364-40a7-b654-966a74211add-node-pullsecrets\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.231872 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/b86f3c6a-1db9-44c8-911c-46647c933bd7-images\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.232942 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4f012fd-823f-400e-9af8-9a2264f18589-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.233466 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8f1ed20-40fc-4010-be16-03815fce6b82-config\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.233606 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86f3c6a-1db9-44c8-911c-46647c933bd7-config\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.233998 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/650c2009-2949-4ba8-ad59-8188c8a9523b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bkknx\" (UID: \"650c2009-2949-4ba8-ad59-8188c8a9523b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.234089 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-audit\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.234447 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-client-ca\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.234901 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43a2af0e-c364-40a7-b654-966a74211add-config\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.236187 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.237155 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-serving-cert\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.237828 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-etcd-client\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.240405 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-encryption-config\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.240735 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-config\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.241303 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-serving-cert\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.241665 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-encryption-config\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.242395 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.243761 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zt5mh"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.244444 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8f1ed20-40fc-4010-be16-03815fce6b82-serving-cert\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.244800 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.246626 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/43a2af0e-c364-40a7-b654-966a74211add-etcd-client\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.246753 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.246885 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.248070 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9qfns"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.249185 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.250219 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xl9c9"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.250620 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz82p"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.251903 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qcvdl"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.253609 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.254532 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.255574 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.256583 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x4bzs"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.258210 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.260365 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.260887 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.262590 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.264348 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.267221 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.267934 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vxtr2"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.269859 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4lm"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.270890 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.272268 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.273159 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.275750 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.276168 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-4qdcl"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.278591 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rrdjn"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.279318 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.280524 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dkdbz"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.281039 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.282452 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.282493 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.284772 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.286743 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563932-b8gp7"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.290324 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-b8gp7"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.290393 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p2tjp"] Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.290570 4921 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-b8gp7"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.292018 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-747dt"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.292298 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p2tjp"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.293365 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p2tjp"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.293468 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-747dt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.294223 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-747dt"]
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.300689 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308091 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-ca\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308162 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-oauth-serving-cert\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308195 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-service-ca\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308222 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5k6g\" (UniqueName: \"kubernetes.io/projected/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-kube-api-access-h5k6g\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-oauth-config\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308273 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgl7t\" (UniqueName: \"kubernetes.io/projected/8c5658a1-dd31-41cc-a12b-653abf972154-kube-api-access-wgl7t\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-service-ca\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-trusted-ca-bundle\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308366 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-config\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308847 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-client\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308914 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-serving-cert\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308952 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5658a1-dd31-41cc-a12b-653abf972154-serving-cert\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.308973 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-config\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.309881 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-config\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.309965 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-oauth-serving-cert\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.310589 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-service-ca\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.311289 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-trusted-ca-bundle\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.311722 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-config\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.311904 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-service-ca\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.312983 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-serving-cert\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.313973 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-client\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.316331 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c5658a1-dd31-41cc-a12b-653abf972154-etcd-ca\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.317966 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5658a1-dd31-41cc-a12b-653abf972154-serving-cert\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.320665 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.325498 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-oauth-config\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.340883 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.360556 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.381701 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.407411 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.421551 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.441047 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.461627 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.480487 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.500503 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.520649 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.541249 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.560923 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.580941 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.601245 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.620977 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.639985 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.661624 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.680510 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.701592 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.720662 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.741548 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.760477 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.786557 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.821792 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.842144 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.862022 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.880861 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.901941 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.921009 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.941545 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.962150 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 18 12:12:00 crc kubenswrapper[4921]: I0318 12:12:00.981983 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.001825 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.021257 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.041372 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.060436 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.081512 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.100815 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.441649 4921 request.go:700] Waited for 1.256244055s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dservice-ca-dockercfg-pn86c&limit=500&resourceVersion=0
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.447284 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.450073 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.452889 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.455471 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6113b50-f73b-4839-ae08-2bc7d4abb024-metrics-certs\") pod \"network-metrics-daemon-g8926\" (UID: \"b6113b50-f73b-4839-ae08-2bc7d4abb024\") " pod="openshift-multus/network-metrics-daemon-g8926"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.456230 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.456432 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.456566 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457192 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457368 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457570 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457475 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457492 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457765 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.457844 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.458020 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.458091 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.458208 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.459619 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.463432 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.467877 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.480384 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.500330 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.520948 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.537858 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g8926"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.542433 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.561218 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.581489 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.600862 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.621569 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.642431 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.662364 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.681325 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.701070 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.721143 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.735386 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g8926"]
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.740499 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 12:12:01 crc kubenswrapper[4921]: W0318 12:12:01.744772 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6113b50_f73b_4839_ae08_2bc7d4abb024.slice/crio-ad06b952a73d718b19e8e01752988347f6ba20bdeb96ee6a42122b78b185c5a7 WatchSource:0}: Error finding container ad06b952a73d718b19e8e01752988347f6ba20bdeb96ee6a42122b78b185c5a7: Status 404 returned error can't find the container with id ad06b952a73d718b19e8e01752988347f6ba20bdeb96ee6a42122b78b185c5a7
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.760630 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.800181 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.838595 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznmw\" (UniqueName: \"kubernetes.io/projected/cbc87ffe-81eb-46ea-b7cd-d98c56df78c9-kube-api-access-zznmw\") pod \"apiserver-7bbb656c7d-jf9wv\" (UID: \"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.856712 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv48s\" (UniqueName: \"kubernetes.io/projected/d4f012fd-823f-400e-9af8-9a2264f18589-kube-api-access-jv48s\") pod \"openshift-apiserver-operator-796bbdcf4f-b5q4c\" (UID: \"d4f012fd-823f-400e-9af8-9a2264f18589\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.873641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g8926" event={"ID":"b6113b50-f73b-4839-ae08-2bc7d4abb024","Type":"ContainerStarted","Data":"ad06b952a73d718b19e8e01752988347f6ba20bdeb96ee6a42122b78b185c5a7"}
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.875574 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qc7w\" (UniqueName: \"kubernetes.io/projected/650c2009-2949-4ba8-ad59-8188c8a9523b-kube-api-access-7qc7w\") pod \"cluster-samples-operator-665b6dd947-bkknx\" (UID: \"650c2009-2949-4ba8-ad59-8188c8a9523b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.895247 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.897358 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgcgc\" (UniqueName: \"kubernetes.io/projected/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-kube-api-access-mgcgc\") pod \"controller-manager-879f6c89f-pwwdq\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.916759 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxk4\" (UniqueName: \"kubernetes.io/projected/b98da0c2-cddf-4701-8703-6821fe2bb520-kube-api-access-vrxk4\") pod \"route-controller-manager-6576b87f9c-qgx2c\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.935836 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbcw\" (UniqueName: \"kubernetes.io/projected/c8f1ed20-40fc-4010-be16-03815fce6b82-kube-api-access-2jbcw\") pod \"authentication-operator-69f744f599-gfrt5\" (UID: \"c8f1ed20-40fc-4010-be16-03815fce6b82\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.957898 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.959267 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pqz\" (UniqueName: \"kubernetes.io/projected/43a2af0e-c364-40a7-b654-966a74211add-kube-api-access-w4pqz\") pod \"apiserver-76f77b778f-zt5mh\" (UID: \"43a2af0e-c364-40a7-b654-966a74211add\") " pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.962324 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.978669 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"
Mar 18 12:12:01 crc kubenswrapper[4921]: I0318 12:12:01.983238 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.000654 4921 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.021501 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.057472 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmx65\" (UniqueName: \"kubernetes.io/projected/b86f3c6a-1db9-44c8-911c-46647c933bd7-kube-api-access-lmx65\") pod \"machine-api-operator-5694c8668f-z5bqz\" (UID: \"b86f3c6a-1db9-44c8-911c-46647c933bd7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.076993 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tx7\" (UniqueName: \"kubernetes.io/projected/3888977f-a875-4d0b-85c1-fb3033156ef7-kube-api-access-c2tx7\") pod \"openshift-config-operator-7777fb866f-2m8pm\" (UID: \"3888977f-a875-4d0b-85c1-fb3033156ef7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.081064 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.101266 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.122009 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.136710 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv"]
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.140327 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.141231 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.147674 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"
Mar 18 12:12:02 crc kubenswrapper[4921]: W0318 12:12:02.148254 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc87ffe_81eb_46ea_b7cd_d98c56df78c9.slice/crio-4097f0d2307d5818c37303ce206cb9634f67a636cb611b1cc324ce03b99610e4 WatchSource:0}: Error finding container 4097f0d2307d5818c37303ce206cb9634f67a636cb611b1cc324ce03b99610e4: Status 404 returned error can't find the container with id 4097f0d2307d5818c37303ce206cb9634f67a636cb611b1cc324ce03b99610e4
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.156640 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.165317 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.166251 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.180539 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.189511 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"]
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.199443 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.200786 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.211826 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx"]
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.220572 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.240725 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.261195 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.280833 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.302673 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.336984 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pwwdq"]
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.356594 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgl7t\" (UniqueName: \"kubernetes.io/projected/8c5658a1-dd31-41cc-a12b-653abf972154-kube-api-access-wgl7t\") pod \"etcd-operator-b45778765-xl9c9\" (UID: \"8c5658a1-dd31-41cc-a12b-653abf972154\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.357249 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5k6g\" (UniqueName: \"kubernetes.io/projected/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-kube-api-access-h5k6g\") pod \"console-f9d7485db-8s2wr\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.365397 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.380372 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8s2wr"
Mar 18 12:12:02 crc kubenswrapper[4921]: W0318 12:12:02.405540 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd9ef5a_53e4_4ea4_a8cb_b86fc76febdf.slice/crio-36923bb41e18bda6e98722f632d85243f9ef9baeed94b013ba05fb4f3029a981 WatchSource:0}: Error finding container 36923bb41e18bda6e98722f632d85243f9ef9baeed94b013ba05fb4f3029a981: Status 404 returned error can't find the container with id 36923bb41e18bda6e98722f632d85243f9ef9baeed94b013ba05fb4f3029a981
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.413165 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.466861 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwsl\" (UniqueName: \"kubernetes.io/projected/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-kube-api-access-pkwsl\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.466907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc95r\" (UniqueName: \"kubernetes.io/projected/6e5d2390-ab7e-4012-9930-3578aff33f2f-kube-api-access-bc95r\") pod \"migrator-59844c95c7-dsk24\" (UID: \"6e5d2390-ab7e-4012-9930-3578aff33f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.466926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/714f8a69-365e-4825-bc9f-5344b2d36e21-signing-key\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.466949 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hxp\" (UniqueName: \"kubernetes.io/projected/fb744db6-3732-400d-8939-2577d28e7cd5-kube-api-access-78hxp\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.466970 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kube-api-access-qff5x\" (UniqueName: \"kubernetes.io/projected/89da31ea-f077-477f-a9c0-52cc265c53c0-kube-api-access-qff5x\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.466999 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-trusted-ca\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467016 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9073ce4b-8c2f-4f53-8d72-0984d42addf4-metrics-tls\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467030 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467044 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89da31ea-f077-477f-a9c0-52cc265c53c0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" 
(UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467058 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plt6n\" (UniqueName: \"kubernetes.io/projected/1704e345-c317-44b3-89a3-0a7bdd9dd901-kube-api-access-plt6n\") pod \"downloads-7954f5f757-swhkn\" (UID: \"1704e345-c317-44b3-89a3-0a7bdd9dd901\") " pod="openshift-console/downloads-7954f5f757-swhkn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467073 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxnjt\" (UniqueName: \"kubernetes.io/projected/71184a3d-1ecb-41e7-b7ed-9bc3e20131cb-kube-api-access-sxnjt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qsrjd\" (UID: \"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467089 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-metrics-certs\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsgv2\" (UniqueName: \"kubernetes.io/projected/7205a33d-ffe1-447c-b1db-756842fcfb4d-kube-api-access-qsgv2\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467179 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf67c\" (UniqueName: \"kubernetes.io/projected/714f8a69-365e-4825-bc9f-5344b2d36e21-kube-api-access-qf67c\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467194 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbee3eb4-1971-45b8-b0a5-3819407584ec-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467209 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbee3eb4-1971-45b8-b0a5-3819407584ec-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467225 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-machine-approver-tls\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467238 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-stats-auth\") pod 
\"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467278 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467306 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hww65\" (UniqueName: \"kubernetes.io/projected/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-kube-api-access-hww65\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467322 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-config\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:02 crc 
kubenswrapper[4921]: I0318 12:12:02.467337 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cml8b\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-kube-api-access-cml8b\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-proxy-tls\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467368 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb744db6-3732-400d-8939-2577d28e7cd5-secret-volume\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467391 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8msft\" (UniqueName: \"kubernetes.io/projected/d3f7009d-281a-4de9-8efd-17d35d4097a6-kube-api-access-8msft\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467406 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b00ba244-0295-4065-a77a-92a947e70d4b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467441 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9afb9d3-e743-46d2-9d48-63807cbc84cf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467459 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9afb9d3-e743-46d2-9d48-63807cbc84cf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467474 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69jz5\" (UniqueName: \"kubernetes.io/projected/428b5469-6fe4-4896-b861-05b406ed0ad6-kube-api-access-69jz5\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-default-certificate\") pod \"router-default-5444994796-52p2z\" (UID: 
\"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467506 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467520 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/428b5469-6fe4-4896-b861-05b406ed0ad6-tmpfs\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467545 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lngd\" (UniqueName: \"kubernetes.io/projected/9073ce4b-8c2f-4f53-8d72-0984d42addf4-kube-api-access-9lngd\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467560 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b00ba244-0295-4065-a77a-92a947e70d4b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467576 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs45\" (UniqueName: \"kubernetes.io/projected/6f991d20-33c4-457b-919b-736a93226768-kube-api-access-fqs45\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467593 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-auth-proxy-config\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467610 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2827febe-00a0-48cf-8170-be2ec745f4e8-srv-cert\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/71184a3d-1ecb-41e7-b7ed-9bc3e20131cb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qsrjd\" (UID: \"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467665 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467681 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-policies\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467696 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f991d20-33c4-457b-919b-736a93226768-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467715 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/428b5469-6fe4-4896-b861-05b406ed0ad6-webhook-cert\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: 
\"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467730 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89da31ea-f077-477f-a9c0-52cc265c53c0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467747 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft444\" (UniqueName: \"kubernetes.io/projected/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-kube-api-access-ft444\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467764 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9afb9d3-e743-46d2-9d48-63807cbc84cf-config\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467778 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f7009d-281a-4de9-8efd-17d35d4097a6-serving-cert\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 
12:12:02.467799 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb744db6-3732-400d-8939-2577d28e7cd5-config-volume\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467816 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3f7009d-281a-4de9-8efd-17d35d4097a6-trusted-ca\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467832 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467848 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b790835-2b38-4cfe-b490-d7c9638b96ec-metrics-tls\") pod \"dns-operator-744455d44c-vxtr2\" (UID: \"5b790835-2b38-4cfe-b490-d7c9638b96ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467865 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/714f8a69-365e-4825-bc9f-5344b2d36e21-signing-cabundle\") pod 
\"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467888 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467904 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-certificates\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467920 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qg9z\" (UniqueName: \"kubernetes.io/projected/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-kube-api-access-4qg9z\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467948 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6d6\" (UniqueName: \"kubernetes.io/projected/200c00bc-1280-4200-aca4-8a798728a787-kube-api-access-tt6d6\") pod \"multus-admission-controller-857f4d67dd-4qdcl\" (UID: \"200c00bc-1280-4200-aca4-8a798728a787\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" Mar 18 12:12:02 
crc kubenswrapper[4921]: I0318 12:12:02.467973 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.467997 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsnbl\" (UniqueName: \"kubernetes.io/projected/b00ba244-0295-4065-a77a-92a947e70d4b-kube-api-access-wsnbl\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468013 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468029 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-serving-cert\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468050 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9073ce4b-8c2f-4f53-8d72-0984d42addf4-trusted-ca\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468088 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-bound-sa-token\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f7009d-281a-4de9-8efd-17d35d4097a6-config\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468142 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-images\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc 
kubenswrapper[4921]: I0318 12:12:02.468174 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb29e5f-85d3-44b2-81f3-61b33e007475-config\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468190 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/428b5469-6fe4-4896-b861-05b406ed0ad6-apiservice-cert\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468219 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-config\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468240 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfcl\" (UniqueName: 
\"kubernetes.io/projected/5b790835-2b38-4cfe-b490-d7c9638b96ec-kube-api-access-qcfcl\") pod \"dns-operator-744455d44c-vxtr2\" (UID: \"5b790835-2b38-4cfe-b490-d7c9638b96ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfhz\" (UniqueName: \"kubernetes.io/projected/2827febe-00a0-48cf-8170-be2ec745f4e8-kube-api-access-kgfhz\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468290 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468308 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ef51394-c324-416d-911b-170f10288c76-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468332 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcfht\" (UniqueName: \"kubernetes.io/projected/eac88057-b7cd-4264-861c-b7d53340338d-kube-api-access-lcfht\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: 
\"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468349 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468364 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-service-ca-bundle\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468379 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f991d20-33c4-457b-919b-736a93226768-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468394 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 
12:12:02.468408 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcb29e5f-85d3-44b2-81f3-61b33e007475-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468424 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdpwx\" (UniqueName: \"kubernetes.io/projected/5ef51394-c324-416d-911b-170f10288c76-kube-api-access-bdpwx\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468439 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-tls\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468454 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2827febe-00a0-48cf-8170-be2ec745f4e8-profile-collector-cert\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468471 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-dir\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468503 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468521 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b00ba244-0295-4065-a77a-92a947e70d4b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468610 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ef51394-c324-416d-911b-170f10288c76-proxy-tls\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468696 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9073ce4b-8c2f-4f53-8d72-0984d42addf4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468766 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/200c00bc-1280-4200-aca4-8a798728a787-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4qdcl\" (UID: \"200c00bc-1280-4200-aca4-8a798728a787\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468795 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468838 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb29e5f-85d3-44b2-81f3-61b33e007475-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468862 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8c868e-8218-4f20-a932-0285a5676da7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g8jgv\" (UID: \"ac8c868e-8218-4f20-a932-0285a5676da7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.468891 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgsn\" (UniqueName: \"kubernetes.io/projected/ac8c868e-8218-4f20-a932-0285a5676da7-kube-api-access-rwgsn\") pod \"package-server-manager-789f6589d5-g8jgv\" (UID: \"ac8c868e-8218-4f20-a932-0285a5676da7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.477580 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:02.977559549 +0000 UTC m=+142.527480188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.570865 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.571088 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.071060745 +0000 UTC m=+142.620981384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572033 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9afb9d3-e743-46d2-9d48-63807cbc84cf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572090 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2hs\" (UniqueName: \"kubernetes.io/projected/11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a-kube-api-access-mr2hs\") pod \"ingress-canary-p2tjp\" (UID: \"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a\") " pod="openshift-ingress-canary/ingress-canary-p2tjp" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572143 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9afb9d3-e743-46d2-9d48-63807cbc84cf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572347 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/80fb60fe-c806-49ed-9f66-b0c0807cb40a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572535 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jz5\" (UniqueName: \"kubernetes.io/projected/428b5469-6fe4-4896-b861-05b406ed0ad6-kube-api-access-69jz5\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-default-certificate\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572582 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572616 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lngd\" (UniqueName: \"kubernetes.io/projected/9073ce4b-8c2f-4f53-8d72-0984d42addf4-kube-api-access-9lngd\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc 
kubenswrapper[4921]: I0318 12:12:02.572632 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/428b5469-6fe4-4896-b861-05b406ed0ad6-tmpfs\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572653 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b00ba244-0295-4065-a77a-92a947e70d4b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572671 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqs45\" (UniqueName: \"kubernetes.io/projected/6f991d20-33c4-457b-919b-736a93226768-kube-api-access-fqs45\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572690 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb4br\" (UniqueName: \"kubernetes.io/projected/3e4be723-81e1-4c74-a380-3ccd634a2f39-kube-api-access-hb4br\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572708 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-auth-proxy-config\") 
pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572727 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572749 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2827febe-00a0-48cf-8170-be2ec745f4e8-srv-cert\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572767 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/71184a3d-1ecb-41e7-b7ed-9bc3e20131cb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qsrjd\" (UID: \"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572783 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 
12:12:02.572799 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-policies\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572816 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f991d20-33c4-457b-919b-736a93226768-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572844 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/428b5469-6fe4-4896-b861-05b406ed0ad6-webhook-cert\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8lw\" (UniqueName: \"kubernetes.io/projected/cff38cba-2769-48cf-98e2-4946ab75a1d7-kube-api-access-cb8lw\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572890 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89da31ea-f077-477f-a9c0-52cc265c53c0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572907 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9afb9d3-e743-46d2-9d48-63807cbc84cf-config\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572922 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f7009d-281a-4de9-8efd-17d35d4097a6-serving-cert\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb744db6-3732-400d-8939-2577d28e7cd5-config-volume\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572954 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft444\" (UniqueName: \"kubernetes.io/projected/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-kube-api-access-ft444\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.572977 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv58d\" 
(UniqueName: \"kubernetes.io/projected/b92aecfc-3e54-4724-9766-45db6ce62f8a-kube-api-access-jv58d\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3f7009d-281a-4de9-8efd-17d35d4097a6-trusted-ca\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573025 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e4be723-81e1-4c74-a380-3ccd634a2f39-ready\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573060 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b790835-2b38-4cfe-b490-d7c9638b96ec-metrics-tls\") pod \"dns-operator-744455d44c-vxtr2\" (UID: \"5b790835-2b38-4cfe-b490-d7c9638b96ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573076 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/714f8a69-365e-4825-bc9f-5344b2d36e21-signing-cabundle\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573094 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573142 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-certificates\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qg9z\" (UniqueName: \"kubernetes.io/projected/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-kube-api-access-4qg9z\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573178 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5hrg\" (UniqueName: \"kubernetes.io/projected/300a483a-7841-4135-ad9f-3ff45b6cef74-kube-api-access-p5hrg\") pod \"csi-hostpathplugin-dkdbz\" (UID: 
\"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573194 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b92aecfc-3e54-4724-9766-45db6ce62f8a-node-bootstrap-token\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6d6\" (UniqueName: \"kubernetes.io/projected/200c00bc-1280-4200-aca4-8a798728a787-kube-api-access-tt6d6\") pod \"multus-admission-controller-857f4d67dd-4qdcl\" (UID: \"200c00bc-1280-4200-aca4-8a798728a787\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573235 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573259 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsnbl\" (UniqueName: \"kubernetes.io/projected/b00ba244-0295-4065-a77a-92a947e70d4b-kube-api-access-wsnbl\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-serving-cert\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b92aecfc-3e54-4724-9766-45db6ce62f8a-certs\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573338 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573355 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sb64\" (UniqueName: \"kubernetes.io/projected/80fb60fe-c806-49ed-9f66-b0c0807cb40a-kube-api-access-4sb64\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573379 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9073ce4b-8c2f-4f53-8d72-0984d42addf4-trusted-ca\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573394 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-mountpoint-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-bound-sa-token\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573432 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f7009d-281a-4de9-8efd-17d35d4097a6-config\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573451 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggszn\" (UniqueName: \"kubernetes.io/projected/58d7472e-2ed9-434f-b1f0-6147f9452a11-kube-api-access-ggszn\") pod \"auto-csr-approver-29563932-b8gp7\" (UID: \"58d7472e-2ed9-434f-b1f0-6147f9452a11\") " pod="openshift-infra/auto-csr-approver-29563932-b8gp7"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573470 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-images\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb29e5f-85d3-44b2-81f3-61b33e007475-config\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573505 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/428b5469-6fe4-4896-b861-05b406ed0ad6-apiservice-cert\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573544 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-config\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573560 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfhz\" (UniqueName: \"kubernetes.io/projected/2827febe-00a0-48cf-8170-be2ec745f4e8-kube-api-access-kgfhz\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573575 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfcl\" (UniqueName: \"kubernetes.io/projected/5b790835-2b38-4cfe-b490-d7c9638b96ec-kube-api-access-qcfcl\") pod \"dns-operator-744455d44c-vxtr2\" (UID: \"5b790835-2b38-4cfe-b490-d7c9638b96ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573598 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573615 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cff38cba-2769-48cf-98e2-4946ab75a1d7-metrics-tls\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573632 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ef51394-c324-416d-911b-170f10288c76-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573651 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcfht\" (UniqueName: \"kubernetes.io/projected/eac88057-b7cd-4264-861c-b7d53340338d-kube-api-access-lcfht\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573667 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573683 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-service-ca-bundle\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573709 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f991d20-33c4-457b-919b-736a93226768-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573725 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcb29e5f-85d3-44b2-81f3-61b33e007475-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573760 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-tls\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573777 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2827febe-00a0-48cf-8170-be2ec745f4e8-profile-collector-cert\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdpwx\" (UniqueName: \"kubernetes.io/projected/5ef51394-c324-416d-911b-170f10288c76-kube-api-access-bdpwx\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573807 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-socket-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573822 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff38cba-2769-48cf-98e2-4946ab75a1d7-config-volume\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573838 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e4be723-81e1-4c74-a380-3ccd634a2f39-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573856 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-dir\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573882 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573897 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573913 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a-cert\") pod \"ingress-canary-p2tjp\" (UID: \"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a\") " pod="openshift-ingress-canary/ingress-canary-p2tjp"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573927 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80fb60fe-c806-49ed-9f66-b0c0807cb40a-srv-cert\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573948 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b00ba244-0295-4065-a77a-92a947e70d4b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.573991 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e4be723-81e1-4c74-a380-3ccd634a2f39-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ef51394-c324-416d-911b-170f10288c76-proxy-tls\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574037 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9073ce4b-8c2f-4f53-8d72-0984d42addf4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574054 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb29e5f-85d3-44b2-81f3-61b33e007475-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574071 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8c868e-8218-4f20-a932-0285a5676da7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g8jgv\" (UID: \"ac8c868e-8218-4f20-a932-0285a5676da7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574088 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgsn\" (UniqueName: \"kubernetes.io/projected/ac8c868e-8218-4f20-a932-0285a5676da7-kube-api-access-rwgsn\") pod \"package-server-manager-789f6589d5-g8jgv\" (UID: \"ac8c868e-8218-4f20-a932-0285a5676da7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574103 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/200c00bc-1280-4200-aca4-8a798728a787-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4qdcl\" (UID: \"200c00bc-1280-4200-aca4-8a798728a787\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574136 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574165 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwsl\" (UniqueName: \"kubernetes.io/projected/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-kube-api-access-pkwsl\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc95r\" (UniqueName: \"kubernetes.io/projected/6e5d2390-ab7e-4012-9930-3578aff33f2f-kube-api-access-bc95r\") pod \"migrator-59844c95c7-dsk24\" (UID: \"6e5d2390-ab7e-4012-9930-3578aff33f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574205 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/714f8a69-365e-4825-bc9f-5344b2d36e21-signing-key\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/428b5469-6fe4-4896-b861-05b406ed0ad6-tmpfs\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574231 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-plugins-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574247 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hxp\" (UniqueName: \"kubernetes.io/projected/fb744db6-3732-400d-8939-2577d28e7cd5-kube-api-access-78hxp\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574266 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qff5x\" (UniqueName: \"kubernetes.io/projected/89da31ea-f077-477f-a9c0-52cc265c53c0-kube-api-access-qff5x\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574284 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-trusted-ca\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9073ce4b-8c2f-4f53-8d72-0984d42addf4-metrics-tls\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574328 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574344 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89da31ea-f077-477f-a9c0-52cc265c53c0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574363 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxnjt\" (UniqueName: \"kubernetes.io/projected/71184a3d-1ecb-41e7-b7ed-9bc3e20131cb-kube-api-access-sxnjt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qsrjd\" (UID: \"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plt6n\" (UniqueName: \"kubernetes.io/projected/1704e345-c317-44b3-89a3-0a7bdd9dd901-kube-api-access-plt6n\") pod \"downloads-7954f5f757-swhkn\" (UID: \"1704e345-c317-44b3-89a3-0a7bdd9dd901\") " pod="openshift-console/downloads-7954f5f757-swhkn"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574396 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-registration-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsgv2\" (UniqueName: \"kubernetes.io/projected/7205a33d-ffe1-447c-b1db-756842fcfb4d-kube-api-access-qsgv2\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574432 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-metrics-certs\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf67c\" (UniqueName: \"kubernetes.io/projected/714f8a69-365e-4825-bc9f-5344b2d36e21-kube-api-access-qf67c\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574475 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbee3eb4-1971-45b8-b0a5-3819407584ec-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574491 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbee3eb4-1971-45b8-b0a5-3819407584ec-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574506 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-machine-approver-tls\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574521 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-stats-auth\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574548 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574563 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-csi-data-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574596 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cml8b\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-kube-api-access-cml8b\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574613 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-proxy-tls\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574629 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hww65\" (UniqueName: \"kubernetes.io/projected/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-kube-api-access-hww65\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574645 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-config\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574661 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb744db6-3732-400d-8939-2577d28e7cd5-secret-volume\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574679 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8msft\" (UniqueName: \"kubernetes.io/projected/d3f7009d-281a-4de9-8efd-17d35d4097a6-kube-api-access-8msft\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b00ba244-0295-4065-a77a-92a947e70d4b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb744db6-3732-400d-8939-2577d28e7cd5-config-volume\") pod \"collect-profiles-29563920-2bn6n\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.574995 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.575326 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.575576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-policies\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.576190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-dir\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.576386 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f991d20-33c4-457b-919b-736a93226768-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.577293 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.578497 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3f7009d-281a-4de9-8efd-17d35d4097a6-trusted-ca\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.579021 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.579088 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-auth-proxy-config\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.579363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-trusted-ca\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.580023 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f7009d-281a-4de9-8efd-17d35d4097a6-config\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.581444 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/714f8a69-365e-4825-bc9f-5344b2d36e21-signing-cabundle\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.583750 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-certificates\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.583988 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b00ba244-0295-4065-a77a-92a947e70d4b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.586989 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ef51394-c324-416d-911b-170f10288c76-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"
Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.587462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName:
\"kubernetes.io/configmap/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-images\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.587853 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9afb9d3-e743-46d2-9d48-63807cbc84cf-config\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.587931 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb29e5f-85d3-44b2-81f3-61b33e007475-config\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.588433 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.088419003 +0000 UTC m=+142.638339642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.589497 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9073ce4b-8c2f-4f53-8d72-0984d42addf4-trusted-ca\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.593793 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ef51394-c324-416d-911b-170f10288c76-proxy-tls\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.594001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-config\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.594048 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kz82p\" 
(UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.595447 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.596013 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89da31ea-f077-477f-a9c0-52cc265c53c0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.596221 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zt5mh"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.596905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-service-ca-bundle\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.597327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.597796 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbee3eb4-1971-45b8-b0a5-3819407584ec-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.598494 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-config\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.608460 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-stats-auth\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.608629 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.610652 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb744db6-3732-400d-8939-2577d28e7cd5-secret-volume\") pod \"collect-profiles-29563920-2bn6n\" (UID: 
\"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.612633 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89da31ea-f077-477f-a9c0-52cc265c53c0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.614833 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-machine-approver-tls\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.616591 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9afb9d3-e743-46d2-9d48-63807cbc84cf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.617972 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9073ce4b-8c2f-4f53-8d72-0984d42addf4-metrics-tls\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.618668 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb29e5f-85d3-44b2-81f3-61b33e007475-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.618686 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b790835-2b38-4cfe-b490-d7c9638b96ec-metrics-tls\") pod \"dns-operator-744455d44c-vxtr2\" (UID: \"5b790835-2b38-4cfe-b490-d7c9638b96ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.618717 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/428b5469-6fe4-4896-b861-05b406ed0ad6-webhook-cert\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.618771 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.618886 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc 
kubenswrapper[4921]: I0318 12:12:02.618980 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b00ba244-0295-4065-a77a-92a947e70d4b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.618978 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3f7009d-281a-4de9-8efd-17d35d4097a6-serving-cert\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619043 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2827febe-00a0-48cf-8170-be2ec745f4e8-profile-collector-cert\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619201 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619414 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-default-certificate\") pod 
\"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619456 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/71184a3d-1ecb-41e7-b7ed-9bc3e20131cb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qsrjd\" (UID: \"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619466 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/714f8a69-365e-4825-bc9f-5344b2d36e21-signing-key\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619466 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2827febe-00a0-48cf-8170-be2ec745f4e8-srv-cert\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619637 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-proxy-tls\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.619677 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.620804 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8c868e-8218-4f20-a932-0285a5676da7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g8jgv\" (UID: \"ac8c868e-8218-4f20-a932-0285a5676da7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.621217 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.621096 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.621382 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.621814 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-tls\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.622456 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/428b5469-6fe4-4896-b861-05b406ed0ad6-apiservice-cert\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.622937 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-metrics-certs\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.623050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f991d20-33c4-457b-919b-736a93226768-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.623423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/200c00bc-1280-4200-aca4-8a798728a787-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4qdcl\" (UID: \"200c00bc-1280-4200-aca4-8a798728a787\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.623509 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.623604 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbee3eb4-1971-45b8-b0a5-3819407584ec-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.624346 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.624888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-serving-cert\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.626679 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9afb9d3-e743-46d2-9d48-63807cbc84cf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v7tn2\" (UID: \"d9afb9d3-e743-46d2-9d48-63807cbc84cf\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.639253 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b00ba244-0295-4065-a77a-92a947e70d4b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:02 crc kubenswrapper[4921]: W0318 12:12:02.655468 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3888977f_a875_4d0b_85c1_fb3033156ef7.slice/crio-c3db575d7a52acb44b79526ccb32921a98eeee35810b8f9b78abcd248b720cf4 WatchSource:0}: Error finding container c3db575d7a52acb44b79526ccb32921a98eeee35810b8f9b78abcd248b720cf4: Status 404 returned error can't find the container with id c3db575d7a52acb44b79526ccb32921a98eeee35810b8f9b78abcd248b720cf4 Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.662421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jz5\" (UniqueName: \"kubernetes.io/projected/428b5469-6fe4-4896-b861-05b406ed0ad6-kube-api-access-69jz5\") pod \"packageserver-d55dfcdfc-9g4bd\" (UID: \"428b5469-6fe4-4896-b861-05b406ed0ad6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678187 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678405 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-plugins-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.678489 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.178461283 +0000 UTC m=+142.728381962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678589 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-registration-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678659 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-csi-data-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678678 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-plugins-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2hs\" (UniqueName: \"kubernetes.io/projected/11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a-kube-api-access-mr2hs\") pod \"ingress-canary-p2tjp\" (UID: \"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a\") " pod="openshift-ingress-canary/ingress-canary-p2tjp" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678770 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80fb60fe-c806-49ed-9f66-b0c0807cb40a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678857 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb4br\" (UniqueName: \"kubernetes.io/projected/3e4be723-81e1-4c74-a380-3ccd634a2f39-kube-api-access-hb4br\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-csi-data-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678900 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8lw\" (UniqueName: \"kubernetes.io/projected/cff38cba-2769-48cf-98e2-4946ab75a1d7-kube-api-access-cb8lw\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678913 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-registration-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678941 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv58d\" (UniqueName: \"kubernetes.io/projected/b92aecfc-3e54-4724-9766-45db6ce62f8a-kube-api-access-jv58d\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e4be723-81e1-4c74-a380-3ccd634a2f39-ready\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.678985 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b92aecfc-3e54-4724-9766-45db6ce62f8a-node-bootstrap-token\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679018 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hrg\" (UniqueName: \"kubernetes.io/projected/300a483a-7841-4135-ad9f-3ff45b6cef74-kube-api-access-p5hrg\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679064 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b92aecfc-3e54-4724-9766-45db6ce62f8a-certs\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679094 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679138 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sb64\" (UniqueName: \"kubernetes.io/projected/80fb60fe-c806-49ed-9f66-b0c0807cb40a-kube-api-access-4sb64\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679160 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-mountpoint-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " 
pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679203 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggszn\" (UniqueName: \"kubernetes.io/projected/58d7472e-2ed9-434f-b1f0-6147f9452a11-kube-api-access-ggszn\") pod \"auto-csr-approver-29563932-b8gp7\" (UID: \"58d7472e-2ed9-434f-b1f0-6147f9452a11\") " pod="openshift-infra/auto-csr-approver-29563932-b8gp7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679259 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cff38cba-2769-48cf-98e2-4946ab75a1d7-metrics-tls\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679303 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-socket-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff38cba-2769-48cf-98e2-4946ab75a1d7-config-volume\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e4be723-81e1-4c74-a380-3ccd634a2f39-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 
crc kubenswrapper[4921]: I0318 12:12:02.679384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a-cert\") pod \"ingress-canary-p2tjp\" (UID: \"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a\") " pod="openshift-ingress-canary/ingress-canary-p2tjp" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679406 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80fb60fe-c806-49ed-9f66-b0c0807cb40a-srv-cert\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.679430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e4be723-81e1-4c74-a380-3ccd634a2f39-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.679827 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.17980644 +0000 UTC m=+142.729727179 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.680090 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e4be723-81e1-4c74-a380-3ccd634a2f39-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.680258 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-mountpoint-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.680536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e4be723-81e1-4c74-a380-3ccd634a2f39-ready\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.680608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e4be723-81e1-4c74-a380-3ccd634a2f39-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.680665 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/300a483a-7841-4135-ad9f-3ff45b6cef74-socket-dir\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.682566 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80fb60fe-c806-49ed-9f66-b0c0807cb40a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.685423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b92aecfc-3e54-4724-9766-45db6ce62f8a-certs\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.685647 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqs45\" (UniqueName: \"kubernetes.io/projected/6f991d20-33c4-457b-919b-736a93226768-kube-api-access-fqs45\") pod \"openshift-controller-manager-operator-756b6f6bc6-hck2l\" (UID: \"6f991d20-33c4-457b-919b-736a93226768\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.688762 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a-cert\") pod \"ingress-canary-p2tjp\" (UID: 
\"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a\") " pod="openshift-ingress-canary/ingress-canary-p2tjp" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.689262 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cff38cba-2769-48cf-98e2-4946ab75a1d7-config-volume\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.690377 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b92aecfc-3e54-4724-9766-45db6ce62f8a-node-bootstrap-token\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.694451 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.701280 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cff38cba-2769-48cf-98e2-4946ab75a1d7-metrics-tls\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.703609 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.705028 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z5bqz"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.707312 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/80fb60fe-c806-49ed-9f66-b0c0807cb40a-srv-cert\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.710936 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gfrt5"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.712912 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8s2wr"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.713931 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lngd\" (UniqueName: \"kubernetes.io/projected/9073ce4b-8c2f-4f53-8d72-0984d42addf4-kube-api-access-9lngd\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.721695 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft444\" (UniqueName: \"kubernetes.io/projected/356738b5-d7cc-4ce5-9bd8-1a45bf7630a5-kube-api-access-ft444\") pod \"router-default-5444994796-52p2z\" (UID: \"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5\") " pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.721988 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.752517 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgsn\" (UniqueName: \"kubernetes.io/projected/ac8c868e-8218-4f20-a932-0285a5676da7-kube-api-access-rwgsn\") pod \"package-server-manager-789f6589d5-g8jgv\" (UID: \"ac8c868e-8218-4f20-a932-0285a5676da7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.759862 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qg9z\" (UniqueName: \"kubernetes.io/projected/0b6df968-4ab2-42c8-afd4-c3b5ecea8dad-kube-api-access-4qg9z\") pod \"machine-approver-56656f9798-tvvpg\" (UID: \"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.769363 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.775700 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6d6\" (UniqueName: \"kubernetes.io/projected/200c00bc-1280-4200-aca4-8a798728a787-kube-api-access-tt6d6\") pod \"multus-admission-controller-857f4d67dd-4qdcl\" (UID: \"200c00bc-1280-4200-aca4-8a798728a787\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.781522 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.782088 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.282071236 +0000 UTC m=+142.831991865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.797716 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xl9c9"] Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.809511 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-bound-sa-token\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.825288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9073ce4b-8c2f-4f53-8d72-0984d42addf4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5n9xl\" (UID: \"9073ce4b-8c2f-4f53-8d72-0984d42addf4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.854705 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.859415 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwsl\" (UniqueName: \"kubernetes.io/projected/80efc6d8-059c-47a2-90d4-0f3e0a31ac4a-kube-api-access-pkwsl\") pod \"service-ca-operator-777779d784-dk62n\" (UID: \"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.868755 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.877136 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.884875 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hww65\" (UniqueName: \"kubernetes.io/projected/c33a3955-f7a2-4d3b-9cf1-83872a2d4670-kube-api-access-hww65\") pod \"machine-config-operator-74547568cd-jlph4\" (UID: \"c33a3955-f7a2-4d3b-9cf1-83872a2d4670\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.888080 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.890741 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.390713539 +0000 UTC m=+142.940634178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.890746 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc95r\" (UniqueName: \"kubernetes.io/projected/6e5d2390-ab7e-4012-9930-3578aff33f2f-kube-api-access-bc95r\") pod \"migrator-59844c95c7-dsk24\" (UID: \"6e5d2390-ab7e-4012-9930-3578aff33f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.893821 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.899943 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" event={"ID":"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf","Type":"ContainerStarted","Data":"b0a22434e737509a67e3a8cf613166d56503e8b0799f15c15be9fad40cd30343"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.899993 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" event={"ID":"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf","Type":"ContainerStarted","Data":"36923bb41e18bda6e98722f632d85243f9ef9baeed94b013ba05fb4f3029a981"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.900385 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.911400 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" event={"ID":"b98da0c2-cddf-4701-8703-6821fe2bb520","Type":"ContainerStarted","Data":"bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.911853 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" event={"ID":"b98da0c2-cddf-4701-8703-6821fe2bb520","Type":"ContainerStarted","Data":"d2693bee1574ea26c06e3a11e7b4fef0cf4d0f221a0f12eda5498e8b0658e706"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.911722 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hxp\" (UniqueName: \"kubernetes.io/projected/fb744db6-3732-400d-8939-2577d28e7cd5-kube-api-access-78hxp\") pod \"collect-profiles-29563920-2bn6n\" 
(UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.912621 4921 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pwwdq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.912674 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" podUID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.912800 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.916712 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" event={"ID":"8c5658a1-dd31-41cc-a12b-653abf972154","Type":"ContainerStarted","Data":"8420b4d8b50fbe82d370973e618423e41feeff588d19913c2677836607d0e7a1"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.917362 4921 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qgx2c container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.917394 4921 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" podUID="b98da0c2-cddf-4701-8703-6821fe2bb520" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.918205 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" event={"ID":"43a2af0e-c364-40a7-b654-966a74211add","Type":"ContainerStarted","Data":"1959e8a2a48b27a5ccd6eeaea74000fdd2b7a5b511cda895b19ace1f5db0d1e7"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.918299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qff5x\" (UniqueName: \"kubernetes.io/projected/89da31ea-f077-477f-a9c0-52cc265c53c0-kube-api-access-qff5x\") pod \"kube-storage-version-migrator-operator-b67b599dd-jj9nh\" (UID: \"89da31ea-f077-477f-a9c0-52cc265c53c0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.922097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" event={"ID":"3888977f-a875-4d0b-85c1-fb3033156ef7","Type":"ContainerStarted","Data":"e4165a363f23e86bf1eb5bd2a2df423bf771a397c6c18d909372be9546cf0245"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.922300 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" event={"ID":"3888977f-a875-4d0b-85c1-fb3033156ef7","Type":"ContainerStarted","Data":"c3db575d7a52acb44b79526ccb32921a98eeee35810b8f9b78abcd248b720cf4"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.931780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-52p2z" 
event={"ID":"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5","Type":"ContainerStarted","Data":"674bc497796f6de890168ebb6ee9eab02a6d4ab8543ca98685da0f849e196160"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.963695 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8msft\" (UniqueName: \"kubernetes.io/projected/d3f7009d-281a-4de9-8efd-17d35d4097a6-kube-api-access-8msft\") pod \"console-operator-58897d9998-9qfns\" (UID: \"d3f7009d-281a-4de9-8efd-17d35d4097a6\") " pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.983858 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.986191 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" event={"ID":"b86f3c6a-1db9-44c8-911c-46647c933bd7","Type":"ContainerStarted","Data":"6f583f7866d149910bc145fe5ac0e831ae85d083547bee288fdfd3a10c9274d3"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.988383 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66052ca8-49ea-4ab8-9b3a-eaffd9feda3f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-69gx7\" (UID: \"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.990931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" event={"ID":"d4f012fd-823f-400e-9af8-9a2264f18589","Type":"ContainerStarted","Data":"b85716cf9aaa92a76d1a6324d610de5e197c55082d128f5a9846a53b26257807"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.992275 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.992483 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.492460382 +0000 UTC m=+143.042381021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.992720 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:02 crc kubenswrapper[4921]: E0318 12:12:02.993199 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.493182012 +0000 UTC m=+143.043102651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.996225 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" event={"ID":"650c2009-2949-4ba8-ad59-8188c8a9523b","Type":"ContainerStarted","Data":"0b035daae97e1430ffadb709f3f43542ef98238afd12b043bcd72d4e2cac70ef"} Mar 18 12:12:02 crc kubenswrapper[4921]: I0318 12:12:02.996271 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" event={"ID":"650c2009-2949-4ba8-ad59-8188c8a9523b","Type":"ContainerStarted","Data":"a0f2d2a738af0c168925907c88c31c72bc577fcd4163c795ade0ffcce24cb903"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.001664 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.005707 4921 generic.go:334] "Generic (PLEG): container finished" podID="cbc87ffe-81eb-46ea-b7cd-d98c56df78c9" containerID="94361aee237bffb4cab7f5989962132ad500bac94d99ab7370d5b8f8868c99f6" exitCode=0 Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.005787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" event={"ID":"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9","Type":"ContainerDied","Data":"94361aee237bffb4cab7f5989962132ad500bac94d99ab7370d5b8f8868c99f6"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.005853 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" event={"ID":"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9","Type":"ContainerStarted","Data":"4097f0d2307d5818c37303ce206cb9634f67a636cb611b1cc324ce03b99610e4"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.018668 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxnjt\" (UniqueName: \"kubernetes.io/projected/71184a3d-1ecb-41e7-b7ed-9bc3e20131cb-kube-api-access-sxnjt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qsrjd\" (UID: \"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.020038 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsgv2\" (UniqueName: \"kubernetes.io/projected/7205a33d-ffe1-447c-b1db-756842fcfb4d-kube-api-access-qsgv2\") pod \"oauth-openshift-558db77b4-kz82p\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.039544 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-8s2wr" event={"ID":"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2","Type":"ContainerStarted","Data":"dc30c8e9ae50031f83fd3b36a8c20907fd3cbf2362e2de4d20e9116dbb43062d"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.044653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfcl\" (UniqueName: \"kubernetes.io/projected/5b790835-2b38-4cfe-b490-d7c9638b96ec-kube-api-access-qcfcl\") pod \"dns-operator-744455d44c-vxtr2\" (UID: \"5b790835-2b38-4cfe-b490-d7c9638b96ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.051837 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.055780 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.059708 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g8926" event={"ID":"b6113b50-f73b-4839-ae08-2bc7d4abb024","Type":"ContainerStarted","Data":"8448681fc17143876e373287b35c1f09802a642689cc25f4be1bed46dd890782"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.059784 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g8926" event={"ID":"b6113b50-f73b-4839-ae08-2bc7d4abb024","Type":"ContainerStarted","Data":"2cbf8934c92e4b7bbff17c82f55d0969a3878c8cdb1127559b0940b1d36864a1"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.063501 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" 
event={"ID":"c8f1ed20-40fc-4010-be16-03815fce6b82","Type":"ContainerStarted","Data":"9ad2ff7bc7cc6c70cd60793027e2fd0662a93e37a2b1e6f0e02402f7aba772a9"} Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.069163 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcfht\" (UniqueName: \"kubernetes.io/projected/eac88057-b7cd-4264-861c-b7d53340338d-kube-api-access-lcfht\") pod \"marketplace-operator-79b997595-5j4lm\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.078663 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cml8b\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-kube-api-access-cml8b\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.090206 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.095225 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.095614 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:12:03.595593003 +0000 UTC m=+143.145513652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.095682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.102751 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.103696 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.603663745 +0000 UTC m=+143.153584384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.106167 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsnbl\" (UniqueName: \"kubernetes.io/projected/b00ba244-0295-4065-a77a-92a947e70d4b-kube-api-access-wsnbl\") pod \"cluster-image-registry-operator-dc59b4c8b-2trf4\" (UID: \"b00ba244-0295-4065-a77a-92a947e70d4b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.117692 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.120496 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.129659 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.135016 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plt6n\" (UniqueName: \"kubernetes.io/projected/1704e345-c317-44b3-89a3-0a7bdd9dd901-kube-api-access-plt6n\") pod \"downloads-7954f5f757-swhkn\" (UID: \"1704e345-c317-44b3-89a3-0a7bdd9dd901\") " pod="openshift-console/downloads-7954f5f757-swhkn" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.135607 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.141983 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdpwx\" (UniqueName: \"kubernetes.io/projected/5ef51394-c324-416d-911b-170f10288c76-kube-api-access-bdpwx\") pod \"machine-config-controller-84d6567774-7j6jq\" (UID: \"5ef51394-c324-416d-911b-170f10288c76\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:03 crc kubenswrapper[4921]: W0318 12:12:03.151550 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b6df968_4ab2_42c8_afd4_c3b5ecea8dad.slice/crio-6368482f0713a600d0bc9e835408e9d6abb50f25b76a1413188d3990b1001d4e WatchSource:0}: Error finding container 6368482f0713a600d0bc9e835408e9d6abb50f25b76a1413188d3990b1001d4e: Status 404 returned error can't find the container with id 6368482f0713a600d0bc9e835408e9d6abb50f25b76a1413188d3990b1001d4e Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.153756 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.163169 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf67c\" (UniqueName: \"kubernetes.io/projected/714f8a69-365e-4825-bc9f-5344b2d36e21-kube-api-access-qf67c\") pod \"service-ca-9c57cc56f-qcvdl\" (UID: \"714f8a69-365e-4825-bc9f-5344b2d36e21\") " pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.173456 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.180065 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcb29e5f-85d3-44b2-81f3-61b33e007475-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4z68x\" (UID: \"dcb29e5f-85d3-44b2-81f3-61b33e007475\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.188642 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.196799 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.197072 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:12:03.697050848 +0000 UTC m=+143.246971497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.197226 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.198564 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.698549499 +0000 UTC m=+143.248470138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.201177 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4qdcl"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.203096 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgfhz\" (UniqueName: \"kubernetes.io/projected/2827febe-00a0-48cf-8170-be2ec745f4e8-kube-api-access-kgfhz\") pod \"catalog-operator-68c6474976-tcb2z\" (UID: \"2827febe-00a0-48cf-8170-be2ec745f4e8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.231234 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2hs\" (UniqueName: \"kubernetes.io/projected/11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a-kube-api-access-mr2hs\") pod \"ingress-canary-p2tjp\" (UID: \"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a\") " pod="openshift-ingress-canary/ingress-canary-p2tjp" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.240271 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb4br\" (UniqueName: \"kubernetes.io/projected/3e4be723-81e1-4c74-a380-3ccd634a2f39-kube-api-access-hb4br\") pod \"cni-sysctl-allowlist-ds-5pf85\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.274173 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-swhkn" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.274722 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p2tjp" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.280332 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sb64\" (UniqueName: \"kubernetes.io/projected/80fb60fe-c806-49ed-9f66-b0c0807cb40a-kube-api-access-4sb64\") pod \"olm-operator-6b444d44fb-kmct7\" (UID: \"80fb60fe-c806-49ed-9f66-b0c0807cb40a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.281124 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8lw\" (UniqueName: \"kubernetes.io/projected/cff38cba-2769-48cf-98e2-4946ab75a1d7-kube-api-access-cb8lw\") pod \"dns-default-747dt\" (UID: \"cff38cba-2769-48cf-98e2-4946ab75a1d7\") " pod="openshift-dns/dns-default-747dt" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.300988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.301308 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.801293509 +0000 UTC m=+143.351214148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.304268 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv58d\" (UniqueName: \"kubernetes.io/projected/b92aecfc-3e54-4724-9766-45db6ce62f8a-kube-api-access-jv58d\") pod \"machine-config-server-rrdjn\" (UID: \"b92aecfc-3e54-4724-9766-45db6ce62f8a\") " pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.318099 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggszn\" (UniqueName: \"kubernetes.io/projected/58d7472e-2ed9-434f-b1f0-6147f9452a11-kube-api-access-ggszn\") pod \"auto-csr-approver-29563932-b8gp7\" (UID: \"58d7472e-2ed9-434f-b1f0-6147f9452a11\") " pod="openshift-infra/auto-csr-approver-29563932-b8gp7" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.334376 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.339426 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.355567 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hrg\" (UniqueName: \"kubernetes.io/projected/300a483a-7841-4135-ad9f-3ff45b6cef74-kube-api-access-p5hrg\") pod \"csi-hostpathplugin-dkdbz\" (UID: \"300a483a-7841-4135-ad9f-3ff45b6cef74\") " pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.383795 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.395628 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.406033 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.406543 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:03.906527018 +0000 UTC m=+143.456447657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.417273 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.453684 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.460658 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.503149 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.507397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.507545 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:12:04.00752432 +0000 UTC m=+143.557444959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.510938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.511323 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.011308044 +0000 UTC m=+143.561228683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.523149 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.534404 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.538916 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rrdjn" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.543702 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.553131 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.579359 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-747dt" Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.584053 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dk62n"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.610076 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv"] Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.616012 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.616188 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.116171173 +0000 UTC m=+143.666091812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.616407 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.616726 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.116718468 +0000 UTC m=+143.666639107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.717262 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.717427 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.217400872 +0000 UTC m=+143.767321511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.717516 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.717872 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.217858694 +0000 UTC m=+143.767779333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: W0318 12:12:03.718710 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428b5469_6fe4_4896_b861_05b406ed0ad6.slice/crio-6a4c26d51e816e224cd66ca5c727e9faddd80c80c59dd30bc6725a27ce9d8554 WatchSource:0}: Error finding container 6a4c26d51e816e224cd66ca5c727e9faddd80c80c59dd30bc6725a27ce9d8554: Status 404 returned error can't find the container with id 6a4c26d51e816e224cd66ca5c727e9faddd80c80c59dd30bc6725a27ce9d8554 Mar 18 12:12:03 crc kubenswrapper[4921]: W0318 12:12:03.758577 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8c868e_8218_4f20_a932_0285a5676da7.slice/crio-7916f84434fe818027164cbd1066c26172b8685fe46212c0fe5e1b198d95ccfa WatchSource:0}: Error finding container 7916f84434fe818027164cbd1066c26172b8685fe46212c0fe5e1b198d95ccfa: Status 404 returned error can't find the container with id 7916f84434fe818027164cbd1066c26172b8685fe46212c0fe5e1b198d95ccfa Mar 18 12:12:03 crc kubenswrapper[4921]: W0318 12:12:03.775679 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80efc6d8_059c_47a2_90d4_0f3e0a31ac4a.slice/crio-fb3611e7f9479eeb665a4f844505c4743b2f2351136d5d32d59364ad908be7f1 WatchSource:0}: Error finding container fb3611e7f9479eeb665a4f844505c4743b2f2351136d5d32d59364ad908be7f1: Status 404 returned error can't find the container 
with id fb3611e7f9479eeb665a4f844505c4743b2f2351136d5d32d59364ad908be7f1 Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.818558 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.819100 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.319061112 +0000 UTC m=+143.868981751 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.819255 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.819645 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.319630408 +0000 UTC m=+143.869551047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: W0318 12:12:03.888233 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e4be723_81e1_4c74_a380_3ccd634a2f39.slice/crio-41e3dcb0a6c0ea970ec9a443607fd2e3688b37c7d3a685244d9f111ed5963329 WatchSource:0}: Error finding container 41e3dcb0a6c0ea970ec9a443607fd2e3688b37c7d3a685244d9f111ed5963329: Status 404 returned error can't find the container with id 41e3dcb0a6c0ea970ec9a443607fd2e3688b37c7d3a685244d9f111ed5963329 Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.920659 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.920767 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.420748683 +0000 UTC m=+143.970669322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.920937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:03 crc kubenswrapper[4921]: E0318 12:12:03.921243 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.421233827 +0000 UTC m=+143.971154466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:03 crc kubenswrapper[4921]: I0318 12:12:03.975404 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" podStartSLOduration=86.975383128 podStartE2EDuration="1m26.975383128s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:03.974411081 +0000 UTC m=+143.524331720" watchObservedRunningTime="2026-03-18 12:12:03.975383128 +0000 UTC m=+143.525303767" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.022008 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.022215 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.522193678 +0000 UTC m=+144.072114317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.022627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.022981 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.522968579 +0000 UTC m=+144.072889228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.078074 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" event={"ID":"8c5658a1-dd31-41cc-a12b-653abf972154","Type":"ContainerStarted","Data":"3f32d0f5d1ed83f52d10255e08e6f9c9aa678fe6127ca2f4e482c0fd92cd567a"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.085166 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" event={"ID":"ac8c868e-8218-4f20-a932-0285a5676da7","Type":"ContainerStarted","Data":"7916f84434fe818027164cbd1066c26172b8685fe46212c0fe5e1b198d95ccfa"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.097037 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rrdjn" event={"ID":"b92aecfc-3e54-4724-9766-45db6ce62f8a","Type":"ContainerStarted","Data":"119fc4e37225959440a2a45efc600fa51b66919bef7da5f0b76c490c9a8bb631"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.100407 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" podStartSLOduration=87.100389632 podStartE2EDuration="1m27.100389632s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:04.100196876 +0000 UTC 
m=+143.650117535" watchObservedRunningTime="2026-03-18 12:12:04.100389632 +0000 UTC m=+143.650310261" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.121722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" event={"ID":"428b5469-6fe4-4896-b861-05b406ed0ad6","Type":"ContainerStarted","Data":"6a4c26d51e816e224cd66ca5c727e9faddd80c80c59dd30bc6725a27ce9d8554"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.123358 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.123536 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.623506869 +0000 UTC m=+144.173427508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.123998 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.128324 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.628304751 +0000 UTC m=+144.178225400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.129028 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" event={"ID":"200c00bc-1280-4200-aca4-8a798728a787","Type":"ContainerStarted","Data":"d0ce3178742ae291b521981f05244e6858ec4232754c92f1b7a8a3b073497c22"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.142277 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" event={"ID":"d4f012fd-823f-400e-9af8-9a2264f18589","Type":"ContainerStarted","Data":"61b2e06f62f5487f98ba8db84ea1733882dc8de14590be2c95d957360e1c5d93"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.150000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" event={"ID":"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad","Type":"ContainerStarted","Data":"c9c48f4e25e7169c3501fd7c3b8ca1368c599985b623ebfdaf6e93cc9df6c4c5"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.150072 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" event={"ID":"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad","Type":"ContainerStarted","Data":"6368482f0713a600d0bc9e835408e9d6abb50f25b76a1413188d3990b1001d4e"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.172951 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" event={"ID":"6f991d20-33c4-457b-919b-736a93226768","Type":"ContainerStarted","Data":"6fdcb6755844b4b666d1301fe65fdc7a6e0eed8ef41ba66bf019a375654ab89d"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.173010 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" event={"ID":"6f991d20-33c4-457b-919b-736a93226768","Type":"ContainerStarted","Data":"3310dce2a0f62d078c1f2047c66993d4c957705a4b5756ef38dc11982d7720cc"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.195410 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" event={"ID":"3e4be723-81e1-4c74-a380-3ccd634a2f39","Type":"ContainerStarted","Data":"41e3dcb0a6c0ea970ec9a443607fd2e3688b37c7d3a685244d9f111ed5963329"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.219705 4921 generic.go:334] "Generic (PLEG): container finished" podID="43a2af0e-c364-40a7-b654-966a74211add" containerID="f3a2535b64d5cc5c97f252e1c3d5ee41c94c6101dbbfc4a247e2272dee5d514c" exitCode=0 Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.220085 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" event={"ID":"43a2af0e-c364-40a7-b654-966a74211add","Type":"ContainerDied","Data":"f3a2535b64d5cc5c97f252e1c3d5ee41c94c6101dbbfc4a247e2272dee5d514c"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.221361 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.2213426840000001 podStartE2EDuration="1.221342684s" podCreationTimestamp="2026-03-18 12:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
12:12:04.189526447 +0000 UTC m=+143.739447076" watchObservedRunningTime="2026-03-18 12:12:04.221342684 +0000 UTC m=+143.771263323" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.222228 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" podStartSLOduration=86.222222058 podStartE2EDuration="1m26.222222058s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:04.22086727 +0000 UTC m=+143.770787919" watchObservedRunningTime="2026-03-18 12:12:04.222222058 +0000 UTC m=+143.772142697" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.224802 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.224980 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.724968143 +0000 UTC m=+144.274888782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.225017 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.227707 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-52p2z" event={"ID":"356738b5-d7cc-4ce5-9bd8-1a45bf7630a5","Type":"ContainerStarted","Data":"3d6279243637a06eb60877fd468e2875fa737399a8156c5ecac2fb70184be8d0"} Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.230964 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.730940138 +0000 UTC m=+144.280860847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.247278 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkknx" event={"ID":"650c2009-2949-4ba8-ad59-8188c8a9523b","Type":"ContainerStarted","Data":"e252e1e38135808c2bab8d08cc4b75852dbacef15e023ba55f6f63fc02d1bf08"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.249662 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8s2wr" event={"ID":"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2","Type":"ContainerStarted","Data":"1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.269839 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" event={"ID":"d9afb9d3-e743-46d2-9d48-63807cbc84cf","Type":"ContainerStarted","Data":"08049aced46c0d43f0d11cb06639f42cb03cb5552fa4f44c386c00c0c4ac8885"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.269896 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" event={"ID":"d9afb9d3-e743-46d2-9d48-63807cbc84cf","Type":"ContainerStarted","Data":"65c84becc0f0bcbda07cc5b8f394631fe5d576b1c5343f7d36cc081a126b334d"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.272295 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" event={"ID":"b86f3c6a-1db9-44c8-911c-46647c933bd7","Type":"ContainerStarted","Data":"03aea77c2cd37bd5d66675459e5dad1b195f8a0796b425e136d23b03102f648b"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.272329 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" event={"ID":"b86f3c6a-1db9-44c8-911c-46647c933bd7","Type":"ContainerStarted","Data":"1e98147af892a629e15fa338997fc74e5cb83c9c519deeb8b68b5849c4d0a08f"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.284774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" event={"ID":"cbc87ffe-81eb-46ea-b7cd-d98c56df78c9","Type":"ContainerStarted","Data":"7e39b98f8bdb1c4b4673a483415b3b9c13b6c6991955a42edd286b3774227703"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.305523 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" event={"ID":"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a","Type":"ContainerStarted","Data":"fb3611e7f9479eeb665a4f844505c4743b2f2351136d5d32d59364ad908be7f1"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.308131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl"] Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.317634 4921 generic.go:334] "Generic (PLEG): container finished" podID="3888977f-a875-4d0b-85c1-fb3033156ef7" containerID="e4165a363f23e86bf1eb5bd2a2df423bf771a397c6c18d909372be9546cf0245" exitCode=0 Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.320989 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" 
event={"ID":"3888977f-a875-4d0b-85c1-fb3033156ef7","Type":"ContainerDied","Data":"e4165a363f23e86bf1eb5bd2a2df423bf771a397c6c18d909372be9546cf0245"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.329058 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.329827 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.829811852 +0000 UTC m=+144.379732491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.357672 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gfrt5" event={"ID":"c8f1ed20-40fc-4010-be16-03815fce6b82","Type":"ContainerStarted","Data":"71029c5e0d6e70594a2380dd4f8c13c81ac7b0796c664dfd9312be189f609219"} Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.378731 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:04 crc 
kubenswrapper[4921]: I0318 12:12:04.388720 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9qfns"] Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.433690 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.436403 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:04.936387547 +0000 UTC m=+144.486308266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.436587 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.537403 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.538368 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g8926" podStartSLOduration=86.538342306 podStartE2EDuration="1m26.538342306s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:04.501664366 +0000 UTC m=+144.051585015" watchObservedRunningTime="2026-03-18 12:12:04.538342306 +0000 UTC m=+144.088262945" Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.541486 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.041468872 +0000 UTC m=+144.591389511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.572058 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5q4c" podStartSLOduration=87.572035694 podStartE2EDuration="1m27.572035694s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:04.571446908 +0000 UTC m=+144.121367547" watchObservedRunningTime="2026-03-18 12:12:04.572035694 +0000 UTC m=+144.121956333" Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.642549 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.643193 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.143181114 +0000 UTC m=+144.693101753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.744650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.744885 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.244863535 +0000 UTC m=+144.794784174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.746403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.746401 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" podStartSLOduration=86.746376617 podStartE2EDuration="1m26.746376617s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:04.74612403 +0000 UTC m=+144.296044669" watchObservedRunningTime="2026-03-18 12:12:04.746376617 +0000 UTC m=+144.296297256"
Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.746790 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.246773908 +0000 UTC m=+144.796694547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.771695 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-52p2z"
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.795062 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:12:04 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld
Mar 18 12:12:04 crc kubenswrapper[4921]: [+]process-running ok
Mar 18 12:12:04 crc kubenswrapper[4921]: healthz check failed
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.795154 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.853525 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.853862 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.353846437 +0000 UTC m=+144.903767076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:04 crc kubenswrapper[4921]: I0318 12:12:04.954545 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:04 crc kubenswrapper[4921]: E0318 12:12:04.954839 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.454825959 +0000 UTC m=+145.004746598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.056413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.056865 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.556845069 +0000 UTC m=+145.106765708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.158307 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.172566 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.672530826 +0000 UTC m=+145.222451465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.259706 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.260612 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.760585711 +0000 UTC m=+145.310506350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.335707 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-z5bqz" podStartSLOduration=87.33569039 podStartE2EDuration="1m27.33569039s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.333739497 +0000 UTC m=+144.883660136" watchObservedRunningTime="2026-03-18 12:12:05.33569039 +0000 UTC m=+144.885611029"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.365432 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.365863 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.865849391 +0000 UTC m=+145.415770030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.408519 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.420741 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" podStartSLOduration=87.420722083 podStartE2EDuration="1m27.420722083s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.418626495 +0000 UTC m=+144.968547134" watchObservedRunningTime="2026-03-18 12:12:05.420722083 +0000 UTC m=+144.970642732"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.443655 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz82p"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.468747 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.469775 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:05.969746573 +0000 UTC m=+145.519667212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.469814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9qfns" event={"ID":"d3f7009d-281a-4de9-8efd-17d35d4097a6","Type":"ContainerStarted","Data":"c421e6d754e299d40b36f68ee4d4040ec14565f1f417c542f253aa2e4f16d02c"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.494545 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.495457 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.517326 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.539704 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-52p2z" podStartSLOduration=87.539675739 podStartE2EDuration="1m27.539675739s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.511649657 +0000 UTC m=+145.061570296" watchObservedRunningTime="2026-03-18 12:12:05.539675739 +0000 UTC m=+145.089596388"
Mar 18 12:12:05 crc kubenswrapper[4921]: W0318 12:12:05.549504 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71184a3d_1ecb_41e7_b7ed_9bc3e20131cb.slice/crio-f38862f052d8914637c0f08cdac6b0f6396408f0a51cca20f20a4e15c262098f WatchSource:0}: Error finding container f38862f052d8914637c0f08cdac6b0f6396408f0a51cca20f20a4e15c262098f: Status 404 returned error can't find the container with id f38862f052d8914637c0f08cdac6b0f6396408f0a51cca20f20a4e15c262098f
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.554996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" event={"ID":"0b6df968-4ab2-42c8-afd4-c3b5ecea8dad","Type":"ContainerStarted","Data":"db3aaf68406cc1eb709123eb8fac67255b9a8af898a79ec181a817efd3bd0c9d"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.562699 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.575077 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.578906 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" event={"ID":"80efc6d8-059c-47a2-90d4-0f3e0a31ac4a","Type":"ContainerStarted","Data":"d0495e41fa24620d583896254bd1c62f7e3c1d1b2cbd3704462c8ecd798260a7"}
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.583360 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.083334142 +0000 UTC m=+145.633254781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.585026 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xl9c9" podStartSLOduration=87.585005078 podStartE2EDuration="1m27.585005078s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.568561675 +0000 UTC m=+145.118482324" watchObservedRunningTime="2026-03-18 12:12:05.585005078 +0000 UTC m=+145.134925717"
Mar 18 12:12:05 crc kubenswrapper[4921]: W0318 12:12:05.592984 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5d2390_ab7e_4012_9930_3578aff33f2f.slice/crio-7bc0e6e5e4ce712192ddda5429c8b9e478ddb3b8c2cc6e8316ca847c3532b0b2 WatchSource:0}: Error finding container 7bc0e6e5e4ce712192ddda5429c8b9e478ddb3b8c2cc6e8316ca847c3532b0b2: Status 404 returned error can't find the container with id 7bc0e6e5e4ce712192ddda5429c8b9e478ddb3b8c2cc6e8316ca847c3532b0b2
Mar 18 12:12:05 crc kubenswrapper[4921]: W0318 12:12:05.598065 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7205a33d_ffe1_447c_b1db_756842fcfb4d.slice/crio-65183520cf73022ac3bbc6c466a18839e7872463aa50dc9adf3497bacd142d4c WatchSource:0}: Error finding container 65183520cf73022ac3bbc6c466a18839e7872463aa50dc9adf3497bacd142d4c: Status 404 returned error can't find the container with id 65183520cf73022ac3bbc6c466a18839e7872463aa50dc9adf3497bacd142d4c
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.621187 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" event={"ID":"9073ce4b-8c2f-4f53-8d72-0984d42addf4","Type":"ContainerStarted","Data":"a1834c7d79000d1ae3cbeee49e445fff92d2a542b34d858a67704ed0da494550"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.637388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" event={"ID":"3e4be723-81e1-4c74-a380-3ccd634a2f39","Type":"ContainerStarted","Data":"74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.637620 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85"
Mar 18 12:12:05 crc kubenswrapper[4921]: W0318 12:12:05.652646 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb744db6_3732_400d_8939_2577d28e7cd5.slice/crio-b7bad6ac6f1cc09ed86330d5f24c802daace8c3d1bd7cbaa735c72420428e189 WatchSource:0}: Error finding container b7bad6ac6f1cc09ed86330d5f24c802daace8c3d1bd7cbaa735c72420428e189: Status 404 returned error can't find the container with id b7bad6ac6f1cc09ed86330d5f24c802daace8c3d1bd7cbaa735c72420428e189
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.653369 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.679342 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v7tn2" podStartSLOduration=87.679312386 podStartE2EDuration="1m27.679312386s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.677841395 +0000 UTC m=+145.227762024" watchObservedRunningTime="2026-03-18 12:12:05.679312386 +0000 UTC m=+145.229233025"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.688827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" event={"ID":"ac8c868e-8218-4f20-a932-0285a5676da7","Type":"ContainerStarted","Data":"d7932d3fb03f5e1130196ce29f02f2b4a92457a85498c60705f5cd2608ab83fa"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.689824 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.696654 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.698196 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.198173645 +0000 UTC m=+145.748094284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.738096 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.752474 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" event={"ID":"200c00bc-1280-4200-aca4-8a798728a787","Type":"ContainerStarted","Data":"9b27b1dbd875e2600c261e7dbe1cc37accabb4ae6dc144cc75e0da36e98c238e"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.784440 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" event={"ID":"428b5469-6fe4-4896-b861-05b406ed0ad6","Type":"ContainerStarted","Data":"71d061119d4985b28c2fc62d6ce65ab3262657898e32e3eb44e3878e62d93d80"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.785692 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.791390 4921 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9g4bd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body=
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.791456 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" podUID="428b5469-6fe4-4896-b861-05b406ed0ad6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.797413 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:12:05 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld
Mar 18 12:12:05 crc kubenswrapper[4921]: [+]process-running ok
Mar 18 12:12:05 crc kubenswrapper[4921]: healthz check failed
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.797475 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.800999 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4lm"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.802069 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.802443 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.302427627 +0000 UTC m=+145.852348266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: W0318 12:12:05.815360 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66052ca8_49ea_4ab8_9b3a_eaffd9feda3f.slice/crio-ab8d267ced7de586f6343fe9c312a12d456b0beb5097e3422c1b95f59ff5af3e WatchSource:0}: Error finding container ab8d267ced7de586f6343fe9c312a12d456b0beb5097e3422c1b95f59ff5af3e: Status 404 returned error can't find the container with id ab8d267ced7de586f6343fe9c312a12d456b0beb5097e3422c1b95f59ff5af3e
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.823934 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vxtr2"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.829774 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hck2l" podStartSLOduration=87.82975061 podStartE2EDuration="1m27.82975061s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.815706513 +0000 UTC m=+145.365627152" watchObservedRunningTime="2026-03-18 12:12:05.82975061 +0000 UTC m=+145.379671269"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.899951 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8s2wr" podStartSLOduration=88.899931043 podStartE2EDuration="1m28.899931043s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.874523513 +0000 UTC m=+145.424444162" watchObservedRunningTime="2026-03-18 12:12:05.899931043 +0000 UTC m=+145.449851682"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.901921 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qcvdl"]
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.925655 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:05 crc kubenswrapper[4921]: E0318 12:12:05.925949 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.425923459 +0000 UTC m=+145.975844098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.929828 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" event={"ID":"3888977f-a875-4d0b-85c1-fb3033156ef7","Type":"ContainerStarted","Data":"da7225c823da413dc2b2cff04358fc92f3f45ba7a1dd49d1e8b47bd7b1cd74e1"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.930022 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm"
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.949547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rrdjn" event={"ID":"b92aecfc-3e54-4724-9766-45db6ce62f8a","Type":"ContainerStarted","Data":"4891a5a108d30c8c18377c3a854d1aca270a9f680ff8d822b258db745a975311"}
Mar 18 12:12:05 crc kubenswrapper[4921]: I0318 12:12:05.974149 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dk62n" podStartSLOduration=87.974132427 podStartE2EDuration="1m27.974132427s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:05.973964893 +0000 UTC m=+145.523885532" watchObservedRunningTime="2026-03-18 12:12:05.974132427 +0000 UTC m=+145.524053056"
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.031248 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.031588 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.5315757 +0000 UTC m=+146.081496339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.040206 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.060630 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p2tjp"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.064498 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" podStartSLOduration=6.064481956 podStartE2EDuration="6.064481956s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.04356769 +0000 UTC m=+145.593488319" watchObservedRunningTime="2026-03-18 12:12:06.064481956 +0000 UTC m=+145.614402585"
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.068968 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dkdbz"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.114028 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" podStartSLOduration=88.1140091 podStartE2EDuration="1m28.1140091s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.103897822 +0000 UTC m=+145.653818461" watchObservedRunningTime="2026-03-18 12:12:06.1140091 +0000 UTC m=+145.663929739"
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.119675 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.122673 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.137165 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.140085 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.140335 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.640316505 +0000 UTC m=+146.190237154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.140696 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs"
Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.141091 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.641078206 +0000 UTC m=+146.190998845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.159302 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-swhkn"]
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.167038 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" podStartSLOduration=88.16700485 podStartE2EDuration="1m28.16700485s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.138534746 +0000 UTC m=+145.688455385" watchObservedRunningTime="2026-03-18 12:12:06.16700485 +0000 UTC m=+145.716925489"
Mar 18 12:12:06 crc kubenswrapper[4921]: W0318 12:12:06.181202 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef51394_c324_416d_911b_170f10288c76.slice/crio-1996e45376cc1fe29fa11edd541cbb8633c780defc722d71bb03ed01f6850f33 WatchSource:0}: Error finding container 1996e45376cc1fe29fa11edd541cbb8633c780defc722d71bb03ed01f6850f33: Status 404 returned error can't find the container with id 1996e45376cc1fe29fa11edd541cbb8633c780defc722d71bb03ed01f6850f33
Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.183199 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl"
podStartSLOduration=88.183183556 podStartE2EDuration="1m28.183183556s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.181691665 +0000 UTC m=+145.731612304" watchObservedRunningTime="2026-03-18 12:12:06.183183556 +0000 UTC m=+145.733104195" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.195044 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-b8gp7"] Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.210849 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-747dt"] Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.242664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.243054 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.743037125 +0000 UTC m=+146.292957764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.280198 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rrdjn" podStartSLOduration=6.280168486 podStartE2EDuration="6.280168486s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.274704426 +0000 UTC m=+145.824625065" watchObservedRunningTime="2026-03-18 12:12:06.280168486 +0000 UTC m=+145.830089195" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.281373 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tvvpg" podStartSLOduration=89.28136338 podStartE2EDuration="1m29.28136338s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.229953534 +0000 UTC m=+145.779874173" watchObservedRunningTime="2026-03-18 12:12:06.28136338 +0000 UTC m=+145.831284019" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.345426 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: 
\"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.346358 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.846334929 +0000 UTC m=+146.396255738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: W0318 12:12:06.424776 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff38cba_2769_48cf_98e2_4946ab75a1d7.slice/crio-4d5c5ec90790dc44febcbfb7f4a587730198f85c8db68eb8482f477b8524d724 WatchSource:0}: Error finding container 4d5c5ec90790dc44febcbfb7f4a587730198f85c8db68eb8482f477b8524d724: Status 404 returned error can't find the container with id 4d5c5ec90790dc44febcbfb7f4a587730198f85c8db68eb8482f477b8524d724 Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.431522 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.446859 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.447366 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:06.947349642 +0000 UTC m=+146.497270281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.549135 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.549483 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.049468285 +0000 UTC m=+146.599388934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.649771 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.650103 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.150087597 +0000 UTC m=+146.700008236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.754865 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.755417 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.255402558 +0000 UTC m=+146.805323197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.776923 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:06 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:06 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:06 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.777000 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.855533 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.856224 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:12:07.356196994 +0000 UTC m=+146.906117623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.895926 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.896378 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.921152 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.958180 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:06 crc kubenswrapper[4921]: E0318 12:12:06.958721 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.458706088 +0000 UTC m=+147.008626727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:06 crc kubenswrapper[4921]: I0318 12:12:06.960881 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" podStartSLOduration=89.960852867 podStartE2EDuration="1m29.960852867s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:06.314096841 +0000 UTC m=+145.864017480" watchObservedRunningTime="2026-03-18 12:12:06.960852867 +0000 UTC m=+146.510773506" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.019524 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" event={"ID":"eac88057-b7cd-4264-861c-b7d53340338d","Type":"ContainerStarted","Data":"5475247ab74b8b063f1543e97a2653e0e958804ae4440787352871d8ca68e103"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.021442 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-747dt" event={"ID":"cff38cba-2769-48cf-98e2-4946ab75a1d7","Type":"ContainerStarted","Data":"4d5c5ec90790dc44febcbfb7f4a587730198f85c8db68eb8482f477b8524d724"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.044575 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" 
event={"ID":"b00ba244-0295-4065-a77a-92a947e70d4b","Type":"ContainerStarted","Data":"f4ec5a44d165430750da60ec2b94ecbc1f0f9eff3a9954a2510fc872a7db063a"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.047152 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" event={"ID":"714f8a69-365e-4825-bc9f-5344b2d36e21","Type":"ContainerStarted","Data":"c058d1a42af3c5c8b201a67154cfd91a86d707cb6d61c3f3707e5533befd2e18"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.047234 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" event={"ID":"714f8a69-365e-4825-bc9f-5344b2d36e21","Type":"ContainerStarted","Data":"8b40a6f265e9942c5d2b29f47f1acdad6ee701c9b867f19476f471f80f0344ae"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.054529 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4qdcl" event={"ID":"200c00bc-1280-4200-aca4-8a798728a787","Type":"ContainerStarted","Data":"6252b8ac2f212f76722da1042c37ba568aebf98bc9f5649f084aeb082bea2d78"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.059069 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.060002 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.559951967 +0000 UTC m=+147.109872606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.081344 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" event={"ID":"80fb60fe-c806-49ed-9f66-b0c0807cb40a","Type":"ContainerStarted","Data":"fae28bf9f4649ff12263434bb3264d5076ff8a05ed5e0f1457b401ba61b35c98"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.081397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" event={"ID":"80fb60fe-c806-49ed-9f66-b0c0807cb40a","Type":"ContainerStarted","Data":"fae0ae2413cbe359776a87ee1598ab5b8522136eaf4855c19902941b9d15c8fd"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.082173 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.086662 4921 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kmct7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.086730 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" podUID="80fb60fe-c806-49ed-9f66-b0c0807cb40a" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.089979 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qcvdl" podStartSLOduration=89.089953153 podStartE2EDuration="1m29.089953153s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.082378875 +0000 UTC m=+146.632299514" watchObservedRunningTime="2026-03-18 12:12:07.089953153 +0000 UTC m=+146.639873782" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.098569 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" event={"ID":"2827febe-00a0-48cf-8170-be2ec745f4e8","Type":"ContainerStarted","Data":"93da298dee13aa63c3fa0d7b6079f6428435daa4b0ea99c8dcd27aedd0881fc5"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.101876 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" event={"ID":"5ef51394-c324-416d-911b-170f10288c76","Type":"ContainerStarted","Data":"1996e45376cc1fe29fa11edd541cbb8633c780defc722d71bb03ed01f6850f33"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.107542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" event={"ID":"ac8c868e-8218-4f20-a932-0285a5676da7","Type":"ContainerStarted","Data":"2d7e2b5da14bb86550e86140f9a4f358abadd0797cfe8508f63cd8f1bc5dccb4"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.110018 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" podStartSLOduration=89.110001176 
podStartE2EDuration="1m29.110001176s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.107806105 +0000 UTC m=+146.657726744" watchObservedRunningTime="2026-03-18 12:12:07.110001176 +0000 UTC m=+146.659921815" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.116216 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p2tjp" event={"ID":"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a","Type":"ContainerStarted","Data":"5296df957ea644b34158510b6595b6cc5803483f2266302aa95e97066eb576c4"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.162522 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.165894 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.665852624 +0000 UTC m=+147.215773263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.198657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" event={"ID":"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f","Type":"ContainerStarted","Data":"ab8d267ced7de586f6343fe9c312a12d456b0beb5097e3422c1b95f59ff5af3e"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.263689 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.264646 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.764629455 +0000 UTC m=+147.314550094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.288156 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" event={"ID":"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb","Type":"ContainerStarted","Data":"83eb2432f209eb0538cf4f52cd89c378357dac47fc64039e89a0af4c8f9bb39a"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.293139 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" event={"ID":"71184a3d-1ecb-41e7-b7ed-9bc3e20131cb","Type":"ContainerStarted","Data":"f38862f052d8914637c0f08cdac6b0f6396408f0a51cca20f20a4e15c262098f"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.293328 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" event={"ID":"5b790835-2b38-4cfe-b490-d7c9638b96ec","Type":"ContainerStarted","Data":"40da7e455ce6068dc7ebc4078ab682d45d8b3dedec7fc4f41cb0def3e3c0d8b8"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.298240 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" event={"ID":"dcb29e5f-85d3-44b2-81f3-61b33e007475","Type":"ContainerStarted","Data":"127a8b7a6e7f7173b5b0520241ce77e3b54eb2064747177fb672a7267e33b843"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.311491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" event={"ID":"6e5d2390-ab7e-4012-9930-3578aff33f2f","Type":"ContainerStarted","Data":"b8c03c676a1add3a825f8a75f6ec3330f2116f7c691fea52a485f3939a560696"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.311553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" event={"ID":"6e5d2390-ab7e-4012-9930-3578aff33f2f","Type":"ContainerStarted","Data":"7bc0e6e5e4ce712192ddda5429c8b9e478ddb3b8c2cc6e8316ca847c3532b0b2"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.317815 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qsrjd" podStartSLOduration=89.3177961 podStartE2EDuration="1m29.3177961s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.315076195 +0000 UTC m=+146.864996834" watchObservedRunningTime="2026-03-18 12:12:07.3177961 +0000 UTC m=+146.867716739" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.338139 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" event={"ID":"c33a3955-f7a2-4d3b-9cf1-83872a2d4670","Type":"ContainerStarted","Data":"71e6368930bfc763388ef40149c088ba82d773a56c5e48f28a9883eec0883ae0"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.338198 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" event={"ID":"c33a3955-f7a2-4d3b-9cf1-83872a2d4670","Type":"ContainerStarted","Data":"ac4deb23fae3ebf6b723987108c4cb4a8b0c8bf12d98109e8330a1352c513bd4"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.347346 4921 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" podStartSLOduration=89.347327313 podStartE2EDuration="1m29.347327313s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.345685978 +0000 UTC m=+146.895606617" watchObservedRunningTime="2026-03-18 12:12:07.347327313 +0000 UTC m=+146.897247952" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.363421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" event={"ID":"58d7472e-2ed9-434f-b1f0-6147f9452a11","Type":"ContainerStarted","Data":"b58adb2adfaae33ec71d29bdb911dbe60f82bdc127af08eeb13a40956aef0e45"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.373495 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.373928 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.873914086 +0000 UTC m=+147.423834725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.389218 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9qfns" event={"ID":"d3f7009d-281a-4de9-8efd-17d35d4097a6","Type":"ContainerStarted","Data":"8b3522656cd8bed21e4cd8ab02e5b52a203d83cfccf6a0b22b47436c0f634480"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.390287 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.428288 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" event={"ID":"fb744db6-3732-400d-8939-2577d28e7cd5","Type":"ContainerStarted","Data":"65d7dc68c785cfb8dbd858a92ba73968ae8a9c262d45508300c51b2dfc95d3e9"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.433195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" event={"ID":"fb744db6-3732-400d-8939-2577d28e7cd5","Type":"ContainerStarted","Data":"b7bad6ac6f1cc09ed86330d5f24c802daace8c3d1bd7cbaa735c72420428e189"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.455449 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-swhkn" 
event={"ID":"1704e345-c317-44b3-89a3-0a7bdd9dd901","Type":"ContainerStarted","Data":"6a777e81fa7b1243553bcd290c3010f77dcb6fe069b913fef025ebe556beec6c"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.459209 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-swhkn" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.474771 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.475720 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:07.97570454 +0000 UTC m=+147.525625179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.478560 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" podStartSLOduration=89.478546358 podStartE2EDuration="1m29.478546358s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.42781373 +0000 UTC m=+146.977734369" watchObservedRunningTime="2026-03-18 12:12:07.478546358 +0000 UTC m=+147.028466997" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.478660 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9qfns" podStartSLOduration=90.478656651 podStartE2EDuration="1m30.478656651s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.47827573 +0000 UTC m=+147.028196369" watchObservedRunningTime="2026-03-18 12:12:07.478656651 +0000 UTC m=+147.028577290" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.488815 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-swhkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 18 12:12:07 crc 
kubenswrapper[4921]: I0318 12:12:07.488869 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-swhkn" podUID="1704e345-c317-44b3-89a3-0a7bdd9dd901" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.506059 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" event={"ID":"300a483a-7841-4135-ad9f-3ff45b6cef74","Type":"ContainerStarted","Data":"2925f46ed9a630da35d4bde95f5e004551908e8c5c6eeae08436c18745fbf6a2"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.525064 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" event={"ID":"9073ce4b-8c2f-4f53-8d72-0984d42addf4","Type":"ContainerStarted","Data":"ac6cd84dff4d8e75379adf3f422978d9686403ddfe6d2bb2c4a514462e24839d"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.525128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" event={"ID":"9073ce4b-8c2f-4f53-8d72-0984d42addf4","Type":"ContainerStarted","Data":"0dda519cb0516223e30452a1099d9c8690eafb153df943215539c4c6b73180b3"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.549764 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" event={"ID":"89da31ea-f077-477f-a9c0-52cc265c53c0","Type":"ContainerStarted","Data":"d943abc2b17d52c94dc8709e3e6f203db66b55dbcb813710a80f87522b799d7c"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.549821 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" 
event={"ID":"89da31ea-f077-477f-a9c0-52cc265c53c0","Type":"ContainerStarted","Data":"bff0615eb2ab5cdf2c791e524716af014d0df34a55d7bb5e13860a217ea1a709"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.580032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.581560 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.081547765 +0000 UTC m=+147.631468394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.606636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" event={"ID":"43a2af0e-c364-40a7-b654-966a74211add","Type":"ContainerStarted","Data":"02b974ac79a00c800f6d1ff5d4370114331ff439ab8b392284b15348db82e6c1"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.606719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" 
event={"ID":"43a2af0e-c364-40a7-b654-966a74211add","Type":"ContainerStarted","Data":"f517491598a377a97a887f4ded44e174ef11caf94577ce8c5f8f84bf63c361c7"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.622467 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" event={"ID":"7205a33d-ffe1-447c-b1db-756842fcfb4d","Type":"ContainerStarted","Data":"f4b91ce515379e77672f494b0a44cb0b6f3407484284b5ac7d9016f3126c151e"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.622533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" event={"ID":"7205a33d-ffe1-447c-b1db-756842fcfb4d","Type":"ContainerStarted","Data":"65183520cf73022ac3bbc6c466a18839e7872463aa50dc9adf3497bacd142d4c"} Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.626649 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.635227 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jf9wv" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.639787 4921 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kz82p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body= Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.639886 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 
12:12:07.659184 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-swhkn" podStartSLOduration=89.658224308 podStartE2EDuration="1m29.658224308s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.643319697 +0000 UTC m=+147.193240336" watchObservedRunningTime="2026-03-18 12:12:07.658224308 +0000 UTC m=+147.208144957" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.660694 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" podStartSLOduration=90.660685895 podStartE2EDuration="1m30.660685895s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.60349516 +0000 UTC m=+147.153415799" watchObservedRunningTime="2026-03-18 12:12:07.660685895 +0000 UTC m=+147.210606534" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.681586 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.687347 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.187321479 +0000 UTC m=+147.737242118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.687981 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.691435 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.191393151 +0000 UTC m=+147.741313790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.711253 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9qfns" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.761882 4921 ???:1] "http: TLS handshake error from 192.168.126.11:57724: no serving certificate available for the kubelet" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.765765 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jj9nh" podStartSLOduration=89.765746689 podStartE2EDuration="1m29.765746689s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.764929017 +0000 UTC m=+147.314849666" watchObservedRunningTime="2026-03-18 12:12:07.765746689 +0000 UTC m=+147.315667338" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.767684 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" podStartSLOduration=90.767666572 podStartE2EDuration="1m30.767666572s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.705404137 +0000 UTC m=+147.255324766" 
watchObservedRunningTime="2026-03-18 12:12:07.767666572 +0000 UTC m=+147.317587211" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.783404 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9g4bd" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.784596 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:07 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:07 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:07 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.784668 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.807673 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.808134 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.308097286 +0000 UTC m=+147.858017925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.874997 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.896863 4921 ???:1] "http: TLS handshake error from 192.168.126.11:57740: no serving certificate available for the kubelet" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.918518 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5n9xl" podStartSLOduration=89.918490317 podStartE2EDuration="1m29.918490317s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.804898178 +0000 UTC m=+147.354818817" watchObservedRunningTime="2026-03-18 12:12:07.918490317 +0000 UTC m=+147.468410956" Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.921544 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" podStartSLOduration=90.921534951 podStartE2EDuration="1m30.921534951s" podCreationTimestamp="2026-03-18 12:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:07.887602576 +0000 UTC m=+147.437523235" watchObservedRunningTime="2026-03-18 12:12:07.921534951 +0000 UTC m=+147.471455590" Mar 18 
12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.935001 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:07 crc kubenswrapper[4921]: E0318 12:12:07.935642 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.435628509 +0000 UTC m=+147.985549148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.952127 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pwwdq"] Mar 18 12:12:07 crc kubenswrapper[4921]: I0318 12:12:07.952347 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" podUID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" containerName="controller-manager" containerID="cri-o://b0a22434e737509a67e3a8cf613166d56503e8b0799f15c15be9fad40cd30343" gracePeriod=30 Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.036785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.037057 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.537029782 +0000 UTC m=+148.086950431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.138904 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.139278 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.639262779 +0000 UTC m=+148.189183418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.195927 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"] Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.202652 4921 ???:1] "http: TLS handshake error from 192.168.126.11:57748: no serving certificate available for the kubelet" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.239630 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.239815 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.739786698 +0000 UTC m=+148.289707337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.239891 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.240272 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.740260161 +0000 UTC m=+148.290180800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.340694 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.340852 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.840830991 +0000 UTC m=+148.390751650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.341045 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.341376 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.841365646 +0000 UTC m=+148.391286295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.354741 4921 ???:1] "http: TLS handshake error from 192.168.126.11:57764: no serving certificate available for the kubelet" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.442391 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.442614 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.942582654 +0000 UTC m=+148.492503293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.442711 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.443155 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:08.943134389 +0000 UTC m=+148.493055088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.462391 4921 ???:1] "http: TLS handshake error from 192.168.126.11:57774: no serving certificate available for the kubelet" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.543800 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.544275 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.044244644 +0000 UTC m=+148.594165293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.544403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.544740 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.044712767 +0000 UTC m=+148.594633406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.582492 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56922: no serving certificate available for the kubelet" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.646010 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.646388 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.146357997 +0000 UTC m=+148.696278646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.667783 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2m8pm" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.672794 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" event={"ID":"eac88057-b7cd-4264-861c-b7d53340338d","Type":"ContainerStarted","Data":"99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.674043 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.677156 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-747dt" event={"ID":"cff38cba-2769-48cf-98e2-4946ab75a1d7","Type":"ContainerStarted","Data":"8073e4466670cd1f0c460ebfe0ec408eb910988bbcf1865db4d0ce40e758428e"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.677190 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-747dt" event={"ID":"cff38cba-2769-48cf-98e2-4946ab75a1d7","Type":"ContainerStarted","Data":"94b34be05db3857a047407fb1d712fff84900f9ef4206b620339a826adc88590"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.677806 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-747dt" Mar 18 12:12:08 crc 
kubenswrapper[4921]: I0318 12:12:08.686009 4921 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5j4lm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.686074 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" podUID="eac88057-b7cd-4264-861c-b7d53340338d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.688961 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" event={"ID":"b00ba244-0295-4065-a77a-92a947e70d4b","Type":"ContainerStarted","Data":"580b28c172a2b6bdb6ab305ffa10db2dda613276fc0657237b6f1fbb065ca257"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.711441 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" event={"ID":"66052ca8-49ea-4ab8-9b3a-eaffd9feda3f","Type":"ContainerStarted","Data":"ef7dba14bb3d287efa740d429fa16328f42e71018e09466b8bdd3b038c8859fe"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.711521 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56930: no serving certificate available for the kubelet" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.734419 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" event={"ID":"5b790835-2b38-4cfe-b490-d7c9638b96ec","Type":"ContainerStarted","Data":"81ce068fb8c2cdfd6db77b90a181b4e06960b94c172d5f43e9640a7e18df92ab"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 
12:12:08.734479 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" event={"ID":"5b790835-2b38-4cfe-b490-d7c9638b96ec","Type":"ContainerStarted","Data":"baef3dd90a117fb84dca5a050bb1f892a36e8b344e08f8b1d71047f06d73a2c5"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.750801 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.753830 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.253813157 +0000 UTC m=+148.803733866 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.762436 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" event={"ID":"2827febe-00a0-48cf-8170-be2ec745f4e8","Type":"ContainerStarted","Data":"92ae8facacb5acb0ce36dc882eb289cdc398f9de1009e2f762445147c32eddc2"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.763390 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.769528 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2trf4" podStartSLOduration=90.7695097 podStartE2EDuration="1m30.7695097s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.735230386 +0000 UTC m=+148.285151045" watchObservedRunningTime="2026-03-18 12:12:08.7695097 +0000 UTC m=+148.319430339" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.770224 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" podStartSLOduration=90.770217749 podStartE2EDuration="1m30.770217749s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.767753071 +0000 UTC m=+148.317673710" watchObservedRunningTime="2026-03-18 12:12:08.770217749 +0000 UTC m=+148.320138388" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.783987 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:08 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:08 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:08 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.784054 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.784429 4921 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-tcb2z container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.784479 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" podUID="2827febe-00a0-48cf-8170-be2ec745f4e8" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.797996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jlph4" event={"ID":"c33a3955-f7a2-4d3b-9cf1-83872a2d4670","Type":"ContainerStarted","Data":"85f1d990785f804dd2d1a64bf3020935879149f153d61e6a877278966551f657"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.811635 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" event={"ID":"5ef51394-c324-416d-911b-170f10288c76","Type":"ContainerStarted","Data":"284850cb4f9ca5fd1767b6d0032a6893b86c8507e57845a8de139ce734deb1e2"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.811679 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" event={"ID":"5ef51394-c324-416d-911b-170f10288c76","Type":"ContainerStarted","Data":"f8dc1ea14adbe1ae2d84a33ec60e5c4a59a60953388fdd399d01b98180c60ca3"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.835427 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-747dt" podStartSLOduration=8.835410265 podStartE2EDuration="8.835410265s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.819821196 +0000 UTC m=+148.369741835" watchObservedRunningTime="2026-03-18 12:12:08.835410265 +0000 UTC m=+148.385330914" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.835941 4921 generic.go:334] "Generic (PLEG): container finished" podID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" containerID="b0a22434e737509a67e3a8cf613166d56503e8b0799f15c15be9fad40cd30343" exitCode=0 Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.836027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" 
event={"ID":"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf","Type":"ContainerDied","Data":"b0a22434e737509a67e3a8cf613166d56503e8b0799f15c15be9fad40cd30343"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.837458 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5pf85"] Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.843051 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56942: no serving certificate available for the kubelet" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.844164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-swhkn" event={"ID":"1704e345-c317-44b3-89a3-0a7bdd9dd901","Type":"ContainerStarted","Data":"6cdd6a9619afa83e955b6374e10840a3474776572f94e4475cb5d7979a25a584"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.845179 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-swhkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.845216 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-swhkn" podUID="1704e345-c317-44b3-89a3-0a7bdd9dd901" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.855021 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.856583 4921 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.356561178 +0000 UTC m=+148.906481817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.860131 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" podStartSLOduration=90.860100405 podStartE2EDuration="1m30.860100405s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.858419589 +0000 UTC m=+148.408340228" watchObservedRunningTime="2026-03-18 12:12:08.860100405 +0000 UTC m=+148.410021044" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.868269 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" event={"ID":"dcb29e5f-85d3-44b2-81f3-61b33e007475","Type":"ContainerStarted","Data":"97c1b9d2c30d42acfd021598510284071a11e2d76744c2cfe46c380f5b23b7da"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.877285 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dsk24" 
event={"ID":"6e5d2390-ab7e-4012-9930-3578aff33f2f","Type":"ContainerStarted","Data":"dcc64b9476e4d52cd01b691bc9db42ae05a6a1a700727e368b026cf95089cd23"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.886728 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-69gx7" podStartSLOduration=90.886711858 podStartE2EDuration="1m30.886711858s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.885162696 +0000 UTC m=+148.435083335" watchObservedRunningTime="2026-03-18 12:12:08.886711858 +0000 UTC m=+148.436632497" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.895635 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p2tjp" event={"ID":"11a9e5fc-a526-4bf5-acb8-8d9d9b1efa3a","Type":"ContainerStarted","Data":"2945f8dd290c5a3e21bb91100e7a67fce32c287179540cdaa8b2992065f95c86"} Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.895791 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" podUID="b98da0c2-cddf-4701-8703-6821fe2bb520" containerName="route-controller-manager" containerID="cri-o://bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473" gracePeriod=30 Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.910708 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kmct7" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.932664 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vxtr2" podStartSLOduration=90.932643824 podStartE2EDuration="1m30.932643824s" 
podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.930205186 +0000 UTC m=+148.480125835" watchObservedRunningTime="2026-03-18 12:12:08.932643824 +0000 UTC m=+148.482564463" Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.961867 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:08 crc kubenswrapper[4921]: E0318 12:12:08.965794 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.465773956 +0000 UTC m=+149.015694595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:08 crc kubenswrapper[4921]: I0318 12:12:08.985747 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7j6jq" podStartSLOduration=90.985723916 podStartE2EDuration="1m30.985723916s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:08.960136331 +0000 UTC m=+148.510056970" watchObservedRunningTime="2026-03-18 12:12:08.985723916 +0000 UTC m=+148.535644555" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.063776 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.064421 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.564399883 +0000 UTC m=+149.114320522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.072933 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4z68x" podStartSLOduration=91.072909627 podStartE2EDuration="1m31.072909627s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:09.056983499 +0000 UTC m=+148.606904148" watchObservedRunningTime="2026-03-18 12:12:09.072909627 +0000 UTC m=+148.622830266" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.132401 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.168928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.169330 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 12:12:09.669317693 +0000 UTC m=+149.219238332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.177523 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p2tjp" podStartSLOduration=9.177506689 podStartE2EDuration="9.177506689s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:09.120132108 +0000 UTC m=+148.670052757" watchObservedRunningTime="2026-03-18 12:12:09.177506689 +0000 UTC m=+148.727427328" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.269767 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-config\") pod \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.269864 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-client-ca\") pod \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.269910 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-serving-cert\") pod \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.269963 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-proxy-ca-bundles\") pod \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.270008 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgcgc\" (UniqueName: \"kubernetes.io/projected/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-kube-api-access-mgcgc\") pod \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\" (UID: \"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.270171 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.270538 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.770520411 +0000 UTC m=+149.320441050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.271525 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-config" (OuterVolumeSpecName: "config") pod "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" (UID: "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.271808 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-client-ca" (OuterVolumeSpecName: "client-ca") pod "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" (UID: "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.272974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" (UID: "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.280761 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" (UID: "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.283301 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-kube-api-access-mgcgc" (OuterVolumeSpecName: "kube-api-access-mgcgc") pod "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" (UID: "ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf"). InnerVolumeSpecName "kube-api-access-mgcgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.371320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.371437 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.371452 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgcgc\" (UniqueName: \"kubernetes.io/projected/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-kube-api-access-mgcgc\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: 
I0318 12:12:09.371467 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.371480 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.371491 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.371806 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.871784191 +0000 UTC m=+149.421704830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.472515 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.472912 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.972894256 +0000 UTC m=+149.522814895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.490540 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.573929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.574426 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.074411952 +0000 UTC m=+149.624332591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.581641 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56954: no serving certificate available for the kubelet" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.583264 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.677151 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-config\") pod \"b98da0c2-cddf-4701-8703-6821fe2bb520\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.677356 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.677396 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98da0c2-cddf-4701-8703-6821fe2bb520-serving-cert\") pod \"b98da0c2-cddf-4701-8703-6821fe2bb520\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 
12:12:09.677437 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrxk4\" (UniqueName: \"kubernetes.io/projected/b98da0c2-cddf-4701-8703-6821fe2bb520-kube-api-access-vrxk4\") pod \"b98da0c2-cddf-4701-8703-6821fe2bb520\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.677464 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-client-ca\") pod \"b98da0c2-cddf-4701-8703-6821fe2bb520\" (UID: \"b98da0c2-cddf-4701-8703-6821fe2bb520\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.681467 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-config" (OuterVolumeSpecName: "config") pod "b98da0c2-cddf-4701-8703-6821fe2bb520" (UID: "b98da0c2-cddf-4701-8703-6821fe2bb520"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.682251 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.182227762 +0000 UTC m=+149.732148411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.683678 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-client-ca" (OuterVolumeSpecName: "client-ca") pod "b98da0c2-cddf-4701-8703-6821fe2bb520" (UID: "b98da0c2-cddf-4701-8703-6821fe2bb520"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.691018 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98da0c2-cddf-4701-8703-6821fe2bb520-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b98da0c2-cddf-4701-8703-6821fe2bb520" (UID: "b98da0c2-cddf-4701-8703-6821fe2bb520"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.701438 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98da0c2-cddf-4701-8703-6821fe2bb520-kube-api-access-vrxk4" (OuterVolumeSpecName: "kube-api-access-vrxk4") pod "b98da0c2-cddf-4701-8703-6821fe2bb520" (UID: "b98da0c2-cddf-4701-8703-6821fe2bb520"). InnerVolumeSpecName "kube-api-access-vrxk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.701521 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc"] Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.701774 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" containerName="controller-manager" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.701797 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" containerName="controller-manager" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.701831 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98da0c2-cddf-4701-8703-6821fe2bb520" containerName="route-controller-manager" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.701839 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98da0c2-cddf-4701-8703-6821fe2bb520" containerName="route-controller-manager" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.701932 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" containerName="controller-manager" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.701951 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b98da0c2-cddf-4701-8703-6821fe2bb520" containerName="route-controller-manager" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.702416 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.711199 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f9455c889-9jlf8"] Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.712337 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.720272 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc"] Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.729288 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9455c889-9jlf8"] Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.788966 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:09 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:09 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:09 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789033 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789702 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3038ab-db0f-4b83-8251-e8391578c76c-serving-cert\") pod 
\"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk42j\" (UniqueName: \"kubernetes.io/projected/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-kube-api-access-pk42j\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789820 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789841 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-client-ca\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789873 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-client-ca\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc 
kubenswrapper[4921]: I0318 12:12:09.789915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-config\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789948 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2x7\" (UniqueName: \"kubernetes.io/projected/1c3038ab-db0f-4b83-8251-e8391578c76c-kube-api-access-xj2x7\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.789994 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-serving-cert\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.790020 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-config\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.790042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-proxy-ca-bundles\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.790095 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.790126 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98da0c2-cddf-4701-8703-6821fe2bb520-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.790140 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrxk4\" (UniqueName: \"kubernetes.io/projected/b98da0c2-cddf-4701-8703-6821fe2bb520-kube-api-access-vrxk4\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.790151 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98da0c2-cddf-4701-8703-6821fe2bb520-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.790508 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.290493455 +0000 UTC m=+149.840414114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891156 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891469 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-serving-cert\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891512 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-config\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891539 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-proxy-ca-bundles\") pod 
\"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891573 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3038ab-db0f-4b83-8251-e8391578c76c-serving-cert\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891636 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk42j\" (UniqueName: \"kubernetes.io/projected/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-kube-api-access-pk42j\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891705 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-client-ca\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-client-ca\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891768 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-config\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.891801 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2x7\" (UniqueName: \"kubernetes.io/projected/1c3038ab-db0f-4b83-8251-e8391578c76c-kube-api-access-xj2x7\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: E0318 12:12:09.892254 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.392235716 +0000 UTC m=+149.942156365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.894523 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-config\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.894623 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-config\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.895322 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-client-ca\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.896044 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-client-ca\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: 
\"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.896386 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-serving-cert\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.897236 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3038ab-db0f-4b83-8251-e8391578c76c-serving-cert\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.897812 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-proxy-ca-bundles\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.919143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" event={"ID":"ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf","Type":"ContainerDied","Data":"36923bb41e18bda6e98722f632d85243f9ef9baeed94b013ba05fb4f3029a981"} Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.919199 4921 scope.go:117] "RemoveContainer" containerID="b0a22434e737509a67e3a8cf613166d56503e8b0799f15c15be9fad40cd30343" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.919329 4921 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pwwdq" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.938215 4921 generic.go:334] "Generic (PLEG): container finished" podID="b98da0c2-cddf-4701-8703-6821fe2bb520" containerID="bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473" exitCode=0 Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.938572 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" event={"ID":"b98da0c2-cddf-4701-8703-6821fe2bb520","Type":"ContainerDied","Data":"bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473"} Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.938584 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.938606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c" event={"ID":"b98da0c2-cddf-4701-8703-6821fe2bb520","Type":"ContainerDied","Data":"d2693bee1574ea26c06e3a11e7b4fef0cf4d0f221a0f12eda5498e8b0658e706"} Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.938641 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk42j\" (UniqueName: \"kubernetes.io/projected/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-kube-api-access-pk42j\") pod \"route-controller-manager-7885789c5f-c7fhc\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.943813 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2x7\" (UniqueName: 
\"kubernetes.io/projected/1c3038ab-db0f-4b83-8251-e8391578c76c-kube-api-access-xj2x7\") pod \"controller-manager-7f9455c889-9jlf8\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.949670 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" event={"ID":"300a483a-7841-4135-ad9f-3ff45b6cef74","Type":"ContainerStarted","Data":"deefc66fcf758bb33730c5a763a7bb9ab10d4c44d9b4f2d6d7d8da0dbcc88dd3"} Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.950491 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" gracePeriod=30 Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.951505 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-swhkn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.951530 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-swhkn" podUID="1704e345-c317-44b3-89a3-0a7bdd9dd901" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.951595 4921 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5j4lm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection 
refused" start-of-body= Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.951613 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" podUID="eac88057-b7cd-4264-861c-b7d53340338d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.960622 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcb2z" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.989319 4921 scope.go:117] "RemoveContainer" containerID="bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473" Mar 18 12:12:09 crc kubenswrapper[4921]: I0318 12:12:09.992901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.008731 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.508710475 +0000 UTC m=+150.058631114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.037084 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.038484 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pwwdq"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.045460 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pwwdq"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.047960 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.053555 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qgx2c"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.071927 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.094359 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.095366 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.595351741 +0000 UTC m=+150.145272380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.108203 4921 scope.go:117] "RemoveContainer" containerID="bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.122330 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473\": container with ID starting with bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473 not found: ID does not exist" containerID="bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473" Mar 18 12:12:10 
crc kubenswrapper[4921]: I0318 12:12:10.122384 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473"} err="failed to get container status \"bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473\": rpc error: code = NotFound desc = could not find container \"bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473\": container with ID starting with bbcb06cc9f7c5220662e21fb6140ea4483b20d815ce21369536390a9fd548473 not found: ID does not exist" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.195926 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.196391 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.696371234 +0000 UTC m=+150.246291913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.297325 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.297514 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.797478309 +0000 UTC m=+150.347398948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.297847 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.298240 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.79822885 +0000 UTC m=+150.348149529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.402654 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.403093 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.903075038 +0000 UTC m=+150.452995677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.427953 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9455c889-9jlf8"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.504001 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.504474 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.004458891 +0000 UTC m=+150.554379530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.513834 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc"] Mar 18 12:12:10 crc kubenswrapper[4921]: W0318 12:12:10.549686 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd72353_b391_4b64_ac21_2d0aa7ff33b3.slice/crio-289db4d430c6d14a7c075251a65f3d32201f42c989941f8d5e27fa8c00a781d1 WatchSource:0}: Error finding container 289db4d430c6d14a7c075251a65f3d32201f42c989941f8d5e27fa8c00a781d1: Status 404 returned error can't find the container with id 289db4d430c6d14a7c075251a65f3d32201f42c989941f8d5e27fa8c00a781d1 Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.605685 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.606007 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.105992408 +0000 UTC m=+150.655913047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.707935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.708403 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.208387619 +0000 UTC m=+150.758308258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.738781 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8gvwb"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.739826 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.745194 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.756869 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gvwb"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.780488 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:10 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:10 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:10 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.780550 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.814754 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.814964 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.314937614 +0000 UTC m=+150.864858253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.815038 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-utilities\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.815066 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtq8n\" (UniqueName: \"kubernetes.io/projected/63c09902-e057-4d3a-811f-e068f2ebe716-kube-api-access-wtq8n\") pod \"certified-operators-8gvwb\" (UID: 
\"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.815133 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.815229 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-catalog-content\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.815590 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.315577971 +0000 UTC m=+150.865498610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.859248 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.860076 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.864651 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.864882 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.875337 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.918736 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.918977 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-catalog-content\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.919017 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee03574f-fd56-4479-8841-9ab945769f33-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.919042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee03574f-fd56-4479-8841-9ab945769f33-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.919097 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtq8n\" (UniqueName: \"kubernetes.io/projected/63c09902-e057-4d3a-811f-e068f2ebe716-kube-api-access-wtq8n\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.919138 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-utilities\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.919698 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-utilities\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: E0318 12:12:10.919799 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.419781042 +0000 UTC m=+150.969701681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.920086 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-catalog-content\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.951642 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56960: no serving certificate available for the kubelet" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.955633 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4c8k6"] Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.956900 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.961596 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.962441 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtq8n\" (UniqueName: \"kubernetes.io/projected/63c09902-e057-4d3a-811f-e068f2ebe716-kube-api-access-wtq8n\") pod \"certified-operators-8gvwb\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:10 crc kubenswrapper[4921]: I0318 12:12:10.972261 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4c8k6"] Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.003603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" event={"ID":"ddd72353-b391-4b64-ac21-2d0aa7ff33b3","Type":"ContainerStarted","Data":"eaca1be5671522b8736fedb42dc99fedc451902139e025c4c8e9f5c8e6c25dea"} Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.003650 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" event={"ID":"ddd72353-b391-4b64-ac21-2d0aa7ff33b3","Type":"ContainerStarted","Data":"289db4d430c6d14a7c075251a65f3d32201f42c989941f8d5e27fa8c00a781d1"} Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.003668 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.027405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.027742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jft5x\" (UniqueName: \"kubernetes.io/projected/125cafaf-afed-45eb-b6c9-0f06ee2637ec-kube-api-access-jft5x\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.027819 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-catalog-content\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.027836 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-utilities\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.027860 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee03574f-fd56-4479-8841-9ab945769f33-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.027884 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee03574f-fd56-4479-8841-9ab945769f33-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.028350 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.528339622 +0000 UTC m=+151.078260261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.028507 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee03574f-fd56-4479-8841-9ab945769f33-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.038291 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" event={"ID":"1c3038ab-db0f-4b83-8251-e8391578c76c","Type":"ContainerStarted","Data":"96ebdd374dea5ccab21658acf8e67d5cd0bb5beb83cf350584090589506f26f4"} Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.038333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" event={"ID":"1c3038ab-db0f-4b83-8251-e8391578c76c","Type":"ContainerStarted","Data":"322a9e991a16471cf92e63cdf681eca92c7ce1c9271cc59e23b2bb5218ac71fc"} Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.064541 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.103893 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee03574f-fd56-4479-8841-9ab945769f33-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.106029 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" podStartSLOduration=3.106006582 podStartE2EDuration="3.106006582s" podCreationTimestamp="2026-03-18 12:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:11.102989579 +0000 UTC m=+150.652910218" watchObservedRunningTime="2026-03-18 12:12:11.106006582 +0000 UTC m=+150.655927221" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.130488 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.130680 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jft5x\" (UniqueName: 
\"kubernetes.io/projected/125cafaf-afed-45eb-b6c9-0f06ee2637ec-kube-api-access-jft5x\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.130808 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-catalog-content\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.130839 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-utilities\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.131230 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-utilities\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.131308 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.631291478 +0000 UTC m=+151.181212107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.133450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-catalog-content\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.149962 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-km5sn"] Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.189101 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" event={"ID":"300a483a-7841-4135-ad9f-3ff45b6cef74","Type":"ContainerStarted","Data":"76fcf5d40e84bb6e0bc0469ad1b73898af5410346a154de461e2bb3277006fe5"} Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.189292 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.189720 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.208515 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.209267 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jft5x\" (UniqueName: \"kubernetes.io/projected/125cafaf-afed-45eb-b6c9-0f06ee2637ec-kube-api-access-jft5x\") pod \"community-operators-4c8k6\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.230516 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" podStartSLOduration=3.230491931 podStartE2EDuration="3.230491931s" podCreationTimestamp="2026-03-18 12:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:11.156557744 +0000 UTC m=+150.706478383" watchObservedRunningTime="2026-03-18 12:12:11.230491931 +0000 UTC m=+150.780412570" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.234554 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.234767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws22g\" (UniqueName: 
\"kubernetes.io/projected/6c23a325-dab9-40a8-bd8b-1f571140cdca-kube-api-access-ws22g\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.234841 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-catalog-content\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.234869 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-utilities\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.238145 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.738108911 +0000 UTC m=+151.288029550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.250993 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98da0c2-cddf-4701-8703-6821fe2bb520" path="/var/lib/kubelet/pods/b98da0c2-cddf-4701-8703-6821fe2bb520/volumes" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.251687 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf" path="/var/lib/kubelet/pods/ccd9ef5a-53e4-4ea4-a8cb-b86fc76febdf/volumes" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.253959 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-km5sn"] Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.290415 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.293732 4921 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.335769 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.336062 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws22g\" (UniqueName: \"kubernetes.io/projected/6c23a325-dab9-40a8-bd8b-1f571140cdca-kube-api-access-ws22g\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.336148 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-catalog-content\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.336178 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-utilities\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.336781 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-utilities\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.336875 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.836852311 +0000 UTC m=+151.386772950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.337518 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-catalog-content\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.342920 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-45chs"] Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.350559 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.355705 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45chs"] Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.378660 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws22g\" (UniqueName: \"kubernetes.io/projected/6c23a325-dab9-40a8-bd8b-1f571140cdca-kube-api-access-ws22g\") pod \"certified-operators-km5sn\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") " pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.388771 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.437840 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lj7s\" (UniqueName: \"kubernetes.io/projected/21bdcd14-9430-4bf6-847e-6a31f0efd11a-kube-api-access-4lj7s\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.437895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.437958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-utilities\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.437993 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-catalog-content\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.438348 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:11.938336576 +0000 UTC m=+151.488257215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.539428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.539659 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.039638457 +0000 UTC m=+151.589559096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.539711 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lj7s\" (UniqueName: \"kubernetes.io/projected/21bdcd14-9430-4bf6-847e-6a31f0efd11a-kube-api-access-4lj7s\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.539750 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.539823 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-utilities\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc 
kubenswrapper[4921]: I0318 12:12:11.539871 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-catalog-content\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.540431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-catalog-content\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.541284 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.041270972 +0000 UTC m=+151.591191611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.548637 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.550574 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-utilities\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.597283 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lj7s\" (UniqueName: \"kubernetes.io/projected/21bdcd14-9430-4bf6-847e-6a31f0efd11a-kube-api-access-4lj7s\") pod \"community-operators-45chs\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") " pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.641235 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.642081 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.142063208 +0000 UTC m=+151.691983847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.686408 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45chs" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.744228 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.744585 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.244572652 +0000 UTC m=+151.794493291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.782334 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.810384 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:11 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:11 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:11 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.810443 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.846096 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.846607 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.346589902 +0000 UTC m=+151.896510541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.878898 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gvwb"] Mar 18 12:12:11 crc kubenswrapper[4921]: W0318 12:12:11.899077 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63c09902_e057_4d3a_811f_e068f2ebe716.slice/crio-e0746dd28698e000029ab7a69d72a00590eaa3dd27cf201557599b3d2d7f368c WatchSource:0}: Error finding container e0746dd28698e000029ab7a69d72a00590eaa3dd27cf201557599b3d2d7f368c: Status 404 returned error can't find the container with id e0746dd28698e000029ab7a69d72a00590eaa3dd27cf201557599b3d2d7f368c Mar 18 12:12:11 crc kubenswrapper[4921]: I0318 12:12:11.947908 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:11 crc kubenswrapper[4921]: E0318 12:12:11.948239 4921 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.448227072 +0000 UTC m=+151.998147711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x4bzs" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.011034 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-km5sn"] Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.042391 4921 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T12:12:11.293760554Z","Handler":null,"Name":""} Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.046207 4921 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.046266 4921 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.048904 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.053142 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.102146 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4c8k6"] Mar 18 12:12:12 crc kubenswrapper[4921]: W0318 12:12:12.108075 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod125cafaf_afed_45eb_b6c9_0f06ee2637ec.slice/crio-6e25cc082e6276b765f38f441f64809af71c7140d00376690fcedc0c9f75803d WatchSource:0}: Error finding container 6e25cc082e6276b765f38f441f64809af71c7140d00376690fcedc0c9f75803d: Status 404 returned error can't find the container with id 6e25cc082e6276b765f38f441f64809af71c7140d00376690fcedc0c9f75803d Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.150423 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.156994 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 
12:12:12.157048 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.162035 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.162081 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.169138 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.193863 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-km5sn" event={"ID":"6c23a325-dab9-40a8-bd8b-1f571140cdca","Type":"ContainerStarted","Data":"d0cb0d358d695c8356775df1a699e0b4e499e900236af3818ea8facc5bd4b728"} Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.195684 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45chs"] Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.195951 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8k6" event={"ID":"125cafaf-afed-45eb-b6c9-0f06ee2637ec","Type":"ContainerStarted","Data":"6e25cc082e6276b765f38f441f64809af71c7140d00376690fcedc0c9f75803d"} Mar 18 12:12:12 crc 
kubenswrapper[4921]: I0318 12:12:12.206989 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" event={"ID":"300a483a-7841-4135-ad9f-3ff45b6cef74","Type":"ContainerStarted","Data":"4e546880aebd559c923453f76b18ec948303bbbffc428e5f846dd6c3ba0dc666"} Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.207055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" event={"ID":"300a483a-7841-4135-ad9f-3ff45b6cef74","Type":"ContainerStarted","Data":"102a474bb444ab44084f73798aca1cab9d2292c29364e10fb469cebd460e7f09"} Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.209714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ee03574f-fd56-4479-8841-9ab945769f33","Type":"ContainerStarted","Data":"890aec50ce531e8f2910aa6e6cb7a37850b404af828341551df6ab463c6daf69"} Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.218538 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x4bzs\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.220371 4921 generic.go:334] "Generic (PLEG): container finished" podID="63c09902-e057-4d3a-811f-e068f2ebe716" containerID="7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055" exitCode=0 Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.221446 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gvwb" event={"ID":"63c09902-e057-4d3a-811f-e068f2ebe716","Type":"ContainerDied","Data":"7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055"} Mar 18 12:12:12 crc 
kubenswrapper[4921]: I0318 12:12:12.221494 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gvwb" event={"ID":"63c09902-e057-4d3a-811f-e068f2ebe716","Type":"ContainerStarted","Data":"e0746dd28698e000029ab7a69d72a00590eaa3dd27cf201557599b3d2d7f368c"} Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.222285 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.231695 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dkdbz" podStartSLOduration=12.23166734 podStartE2EDuration="12.23166734s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:12.228792691 +0000 UTC m=+151.778713330" watchObservedRunningTime="2026-03-18 12:12:12.23166734 +0000 UTC m=+151.781587979" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.233361 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zt5mh" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.235637 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.379502 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.380857 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.382061 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.384280 4921 patch_prober.go:28] interesting pod/console-f9d7485db-8s2wr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.384353 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8s2wr" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 18 12:12:12 crc kubenswrapper[4921]: E0318 12:12:12.507126 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c23a325_dab9_40a8_bd8b_1f571140cdca.slice/crio-7207a9b2365c5b33874b7438b0e9cd0b03112141d57ccaf049c3c37db9e7a7d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod125cafaf_afed_45eb_b6c9_0f06ee2637ec.slice/crio-conmon-ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.727436 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dshr9"] Mar 18 12:12:12 crc 
kubenswrapper[4921]: I0318 12:12:12.729261 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.730946 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.736580 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshr9"] Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.763974 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-utilities\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.764023 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-catalog-content\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.764282 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7vf\" (UniqueName: \"kubernetes.io/projected/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-kube-api-access-tr7vf\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.770185 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:12 crc 
kubenswrapper[4921]: I0318 12:12:12.773755 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:12 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:12 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:12 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.773842 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.865741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7vf\" (UniqueName: \"kubernetes.io/projected/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-kube-api-access-tr7vf\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.865851 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-utilities\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.865884 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-catalog-content\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 
18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.866979 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-utilities\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.868545 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-catalog-content\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.886871 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7vf\" (UniqueName: \"kubernetes.io/projected/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-kube-api-access-tr7vf\") pod \"redhat-marketplace-dshr9\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:12 crc kubenswrapper[4921]: I0318 12:12:12.938290 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x4bzs"] Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.126851 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.135092 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vz2hx"] Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.148504 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.163506 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz2hx"] Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.169628 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wb6b\" (UniqueName: \"kubernetes.io/projected/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-kube-api-access-7wb6b\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.169748 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-catalog-content\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.169779 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-utilities\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.218386 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.234828 4921 generic.go:334] "Generic (PLEG): container finished" podID="fb744db6-3732-400d-8939-2577d28e7cd5" containerID="65d7dc68c785cfb8dbd858a92ba73968ae8a9c262d45508300c51b2dfc95d3e9" exitCode=0 
Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.234914 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" event={"ID":"fb744db6-3732-400d-8939-2577d28e7cd5","Type":"ContainerDied","Data":"65d7dc68c785cfb8dbd858a92ba73968ae8a9c262d45508300c51b2dfc95d3e9"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.237693 4921 generic.go:334] "Generic (PLEG): container finished" podID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerID="a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe" exitCode=0 Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.237778 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45chs" event={"ID":"21bdcd14-9430-4bf6-847e-6a31f0efd11a","Type":"ContainerDied","Data":"a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.237809 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45chs" event={"ID":"21bdcd14-9430-4bf6-847e-6a31f0efd11a","Type":"ContainerStarted","Data":"9b1b74bf97966a75a0db23e0875c8aa088669ca018ff8206d60e3bbb8d8435f4"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.244880 4921 generic.go:334] "Generic (PLEG): container finished" podID="ee03574f-fd56-4479-8841-9ab945769f33" containerID="816cea082d37a795bbe1234b6422bf1874a86938e20746f2ccc391537c9d34bb" exitCode=0 Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.244968 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ee03574f-fd56-4479-8841-9ab945769f33","Type":"ContainerDied","Data":"816cea082d37a795bbe1234b6422bf1874a86938e20746f2ccc391537c9d34bb"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.254427 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" event={"ID":"dbee3eb4-1971-45b8-b0a5-3819407584ec","Type":"ContainerStarted","Data":"57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.254485 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" event={"ID":"dbee3eb4-1971-45b8-b0a5-3819407584ec","Type":"ContainerStarted","Data":"617785ee0960af458b79d3685989a134e54459d2cb2478e920c396d073756b77"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.254941 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.257629 4921 generic.go:334] "Generic (PLEG): container finished" podID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerID="7207a9b2365c5b33874b7438b0e9cd0b03112141d57ccaf049c3c37db9e7a7d5" exitCode=0 Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.257674 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-km5sn" event={"ID":"6c23a325-dab9-40a8-bd8b-1f571140cdca","Type":"ContainerDied","Data":"7207a9b2365c5b33874b7438b0e9cd0b03112141d57ccaf049c3c37db9e7a7d5"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.266333 4921 generic.go:334] "Generic (PLEG): container finished" podID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerID="ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0" exitCode=0 Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.266514 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8k6" event={"ID":"125cafaf-afed-45eb-b6c9-0f06ee2637ec","Type":"ContainerDied","Data":"ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0"} Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.273025 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-utilities\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.273455 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-utilities\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.274281 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wb6b\" (UniqueName: \"kubernetes.io/projected/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-kube-api-access-7wb6b\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.274378 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-catalog-content\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.274883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-catalog-content\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.275438 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-swhkn 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.275500 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-swhkn" podUID="1704e345-c317-44b3-89a3-0a7bdd9dd901" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.277181 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-swhkn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.277228 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-swhkn" podUID="1704e345-c317-44b3-89a3-0a7bdd9dd901" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.291024 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" podStartSLOduration=95.290991581 podStartE2EDuration="1m35.290991581s" podCreationTimestamp="2026-03-18 12:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:13.288466871 +0000 UTC m=+152.838387530" watchObservedRunningTime="2026-03-18 12:12:13.290991581 +0000 UTC m=+152.840912220" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.311811 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7wb6b\" (UniqueName: \"kubernetes.io/projected/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-kube-api-access-7wb6b\") pod \"redhat-marketplace-vz2hx\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.435335 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshr9"] Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.524846 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.545986 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56970: no serving certificate available for the kubelet" Mar 18 12:12:13 crc kubenswrapper[4921]: E0318 12:12:13.552244 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:13 crc kubenswrapper[4921]: E0318 12:12:13.555810 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:13 crc kubenswrapper[4921]: E0318 12:12:13.563939 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:13 crc 
kubenswrapper[4921]: E0318 12:12:13.564015 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.784655 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:13 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:13 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:13 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.785043 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.932193 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pf9vr"] Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.933161 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.935243 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.944092 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pf9vr"] Mar 18 12:12:13 crc kubenswrapper[4921]: I0318 12:12:13.979362 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz2hx"] Mar 18 12:12:14 crc kubenswrapper[4921]: W0318 12:12:14.005255 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf28d70d_8b0a_4ae1_9747_99e9d42767a6.slice/crio-e14069dd873dbbceb1f28850baffbc20cc43c9517481f4619491ce8046341167 WatchSource:0}: Error finding container e14069dd873dbbceb1f28850baffbc20cc43c9517481f4619491ce8046341167: Status 404 returned error can't find the container with id e14069dd873dbbceb1f28850baffbc20cc43c9517481f4619491ce8046341167 Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.089020 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/09b95848-38ec-4890-9cd2-83bc2e137c4a-kube-api-access-qx4fw\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.089196 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-catalog-content\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc 
kubenswrapper[4921]: I0318 12:12:14.089249 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-utilities\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.192716 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-catalog-content\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.192798 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-utilities\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.192863 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/09b95848-38ec-4890-9cd2-83bc2e137c4a-kube-api-access-qx4fw\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.193688 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-catalog-content\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.193941 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-utilities\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.232031 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/09b95848-38ec-4890-9cd2-83bc2e137c4a-kube-api-access-qx4fw\") pod \"redhat-operators-pf9vr\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.254269 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.293883 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.293957 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.293984 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.294014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.299255 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.308985 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.309262 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz2hx" event={"ID":"cf28d70d-8b0a-4ae1-9747-99e9d42767a6","Type":"ContainerStarted","Data":"e14069dd873dbbceb1f28850baffbc20cc43c9517481f4619491ce8046341167"} Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.310015 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.310316 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.311316 4921 generic.go:334] "Generic (PLEG): container finished" podID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerID="448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297" exitCode=0 Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.311613 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerDied","Data":"448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297"} Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.311643 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerStarted","Data":"79b4011e68d50e464bb565e35ff3fd2a2f9dc9e241abb66319d3a415a0fdf409"} Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.345959 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hx4pr"] Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.347172 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.365487 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hx4pr"] Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.457595 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.475222 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.484247 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.515214 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-catalog-content\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.515262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6kt\" (UniqueName: \"kubernetes.io/projected/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-kube-api-access-2m6kt\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.515332 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-utilities\") pod 
\"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.617255 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-utilities\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.617504 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-catalog-content\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.617554 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6kt\" (UniqueName: \"kubernetes.io/projected/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-kube-api-access-2m6kt\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.617755 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-utilities\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.617785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-catalog-content\") pod \"redhat-operators-hx4pr\" (UID: 
\"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.639002 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6kt\" (UniqueName: \"kubernetes.io/projected/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-kube-api-access-2m6kt\") pod \"redhat-operators-hx4pr\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") " pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.686568 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.749838 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hx4pr" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.764053 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.777744 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:14 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:14 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:14 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.777809 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.823581 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hxp\" (UniqueName: \"kubernetes.io/projected/fb744db6-3732-400d-8939-2577d28e7cd5-kube-api-access-78hxp\") pod \"fb744db6-3732-400d-8939-2577d28e7cd5\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.823744 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb744db6-3732-400d-8939-2577d28e7cd5-config-volume\") pod \"fb744db6-3732-400d-8939-2577d28e7cd5\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.823793 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb744db6-3732-400d-8939-2577d28e7cd5-secret-volume\") pod \"fb744db6-3732-400d-8939-2577d28e7cd5\" (UID: \"fb744db6-3732-400d-8939-2577d28e7cd5\") " Mar 18 
12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.827705 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb744db6-3732-400d-8939-2577d28e7cd5-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb744db6-3732-400d-8939-2577d28e7cd5" (UID: "fb744db6-3732-400d-8939-2577d28e7cd5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.829394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb744db6-3732-400d-8939-2577d28e7cd5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb744db6-3732-400d-8939-2577d28e7cd5" (UID: "fb744db6-3732-400d-8939-2577d28e7cd5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.831641 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb744db6-3732-400d-8939-2577d28e7cd5-kube-api-access-78hxp" (OuterVolumeSpecName: "kube-api-access-78hxp") pod "fb744db6-3732-400d-8939-2577d28e7cd5" (UID: "fb744db6-3732-400d-8939-2577d28e7cd5"). InnerVolumeSpecName "kube-api-access-78hxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.925462 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee03574f-fd56-4479-8841-9ab945769f33-kubelet-dir\") pod \"ee03574f-fd56-4479-8841-9ab945769f33\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.925623 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee03574f-fd56-4479-8841-9ab945769f33-kube-api-access\") pod \"ee03574f-fd56-4479-8841-9ab945769f33\" (UID: \"ee03574f-fd56-4479-8841-9ab945769f33\") " Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.925700 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee03574f-fd56-4479-8841-9ab945769f33-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ee03574f-fd56-4479-8841-9ab945769f33" (UID: "ee03574f-fd56-4479-8841-9ab945769f33"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.926236 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb744db6-3732-400d-8939-2577d28e7cd5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.926267 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78hxp\" (UniqueName: \"kubernetes.io/projected/fb744db6-3732-400d-8939-2577d28e7cd5-kube-api-access-78hxp\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.926282 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee03574f-fd56-4479-8841-9ab945769f33-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.926295 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb744db6-3732-400d-8939-2577d28e7cd5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.935757 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:12:14 crc kubenswrapper[4921]: E0318 12:12:14.936021 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb744db6-3732-400d-8939-2577d28e7cd5" containerName="collect-profiles" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.936043 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb744db6-3732-400d-8939-2577d28e7cd5" containerName="collect-profiles" Mar 18 12:12:14 crc kubenswrapper[4921]: E0318 12:12:14.936055 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee03574f-fd56-4479-8841-9ab945769f33" containerName="pruner" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.936064 4921 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ee03574f-fd56-4479-8841-9ab945769f33" containerName="pruner" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.936186 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee03574f-fd56-4479-8841-9ab945769f33" containerName="pruner" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.936198 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb744db6-3732-400d-8939-2577d28e7cd5" containerName="collect-profiles" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.936562 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.949187 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.949407 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.957407 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.963533 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee03574f-fd56-4479-8841-9ab945769f33-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ee03574f-fd56-4479-8841-9ab945769f33" (UID: "ee03574f-fd56-4479-8841-9ab945769f33"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:14 crc kubenswrapper[4921]: I0318 12:12:14.964362 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pf9vr"] Mar 18 12:12:15 crc kubenswrapper[4921]: W0318 12:12:15.018178 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b95848_38ec_4890_9cd2_83bc2e137c4a.slice/crio-889651df45aa4e220583ea0d29f61ed8f626e9a44c22323c32444ca98a71f018 WatchSource:0}: Error finding container 889651df45aa4e220583ea0d29f61ed8f626e9a44c22323c32444ca98a71f018: Status 404 returned error can't find the container with id 889651df45aa4e220583ea0d29f61ed8f626e9a44c22323c32444ca98a71f018 Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.028414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ae80f0-db0e-4503-9362-d18d78c78a02-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.028446 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3ae80f0-db0e-4503-9362-d18d78c78a02-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.028549 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee03574f-fd56-4479-8841-9ab945769f33-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.129405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ae80f0-db0e-4503-9362-d18d78c78a02-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.129786 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3ae80f0-db0e-4503-9362-d18d78c78a02-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.129573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ae80f0-db0e-4503-9362-d18d78c78a02-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.151684 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3ae80f0-db0e-4503-9362-d18d78c78a02-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.240570 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.278372 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.325562 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9dd22e2d3919ecca41c77f1b0decd24edf9e3c80f8bb0e5e4016b07871c4acd7"} Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.343759 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hx4pr"] Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.346497 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" event={"ID":"fb744db6-3732-400d-8939-2577d28e7cd5","Type":"ContainerDied","Data":"b7bad6ac6f1cc09ed86330d5f24c802daace8c3d1bd7cbaa735c72420428e189"} Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.346538 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bad6ac6f1cc09ed86330d5f24c802daace8c3d1bd7cbaa735c72420428e189" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.346620 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.392624 4921 generic.go:334] "Generic (PLEG): container finished" podID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerID="6e24826f8a7d51093861a8d9766550ff5209df698a121eb3d2b6c38bd3679545" exitCode=0 Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.392731 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz2hx" event={"ID":"cf28d70d-8b0a-4ae1-9747-99e9d42767a6","Type":"ContainerDied","Data":"6e24826f8a7d51093861a8d9766550ff5209df698a121eb3d2b6c38bd3679545"} Mar 18 12:12:15 crc kubenswrapper[4921]: W0318 12:12:15.412197 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9350ca3c_fa40_4169_87f2_06ac9d6c16bf.slice/crio-ae2f10a1ecd7c55ba6b585b74ce63726a652ff3799e3185f6eef613d94a604c9 WatchSource:0}: Error finding container ae2f10a1ecd7c55ba6b585b74ce63726a652ff3799e3185f6eef613d94a604c9: Status 404 returned error can't find the container with id ae2f10a1ecd7c55ba6b585b74ce63726a652ff3799e3185f6eef613d94a604c9 Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.442213 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.442196129 podStartE2EDuration="442.196129ms" podCreationTimestamp="2026-03-18 12:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:15.435266738 +0000 UTC m=+154.985187377" watchObservedRunningTime="2026-03-18 12:12:15.442196129 +0000 UTC m=+154.992116768" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.454979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf9vr" 
event={"ID":"09b95848-38ec-4890-9cd2-83bc2e137c4a","Type":"ContainerStarted","Data":"889651df45aa4e220583ea0d29f61ed8f626e9a44c22323c32444ca98a71f018"} Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.518709 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bb415408c8c21f49a859821749e97a2a2df1bd568f2fcba511b09f2804fb61ba"} Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.536356 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.536677 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ee03574f-fd56-4479-8841-9ab945769f33","Type":"ContainerDied","Data":"890aec50ce531e8f2910aa6e6cb7a37850b404af828341551df6ab463c6daf69"} Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.536698 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="890aec50ce531e8f2910aa6e6cb7a37850b404af828341551df6ab463c6daf69" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.558499 4921 ???:1] "http: TLS handshake error from 192.168.126.11:56978: no serving certificate available for the kubelet" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.780790 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:15 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:15 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:15 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 
12:12:15.781210 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:15 crc kubenswrapper[4921]: I0318 12:12:15.946709 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.548622 4921 generic.go:334] "Generic (PLEG): container finished" podID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerID="80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50" exitCode=0 Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.549374 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf9vr" event={"ID":"09b95848-38ec-4890-9cd2-83bc2e137c4a","Type":"ContainerDied","Data":"80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.553410 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f3ae80f0-db0e-4503-9362-d18d78c78a02","Type":"ContainerStarted","Data":"1b893d43b4785ee5d69a9d13a47641f608bda30ccd0962211791358b79734b09"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.555826 4921 generic.go:334] "Generic (PLEG): container finished" podID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerID="9abd892cccae12932d757e6c9f61b16311ada66f1adb273fcb23ed70fcb56f62" exitCode=0 Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.555864 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hx4pr" event={"ID":"9350ca3c-fa40-4169-87f2-06ac9d6c16bf","Type":"ContainerDied","Data":"9abd892cccae12932d757e6c9f61b16311ada66f1adb273fcb23ed70fcb56f62"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.555882 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hx4pr" event={"ID":"9350ca3c-fa40-4169-87f2-06ac9d6c16bf","Type":"ContainerStarted","Data":"ae2f10a1ecd7c55ba6b585b74ce63726a652ff3799e3185f6eef613d94a604c9"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.561659 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e4c89e73b6679d9c860379271e08c188dec7c1d770b704807b722877703fc9bf"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.571285 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f41dc1d1755f342ba872bc726ba201b4ab2625052e6fdac67ca360cc0ec1d9ff"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.572039 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.588485 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7b0b1b082bfcf5eca95ffa9f4d0c2c7c07b836270002f6a048b7cd8b6a0cea7d"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.588536 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a35b88c318a84c3a8b6d8d6156f7b54ee3a1a039c902b1c4e6abad6275c94c42"} Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.783972 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:16 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:16 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:16 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:16 crc kubenswrapper[4921]: I0318 12:12:16.784144 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:17 crc kubenswrapper[4921]: I0318 12:12:17.530185 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:12:17 crc kubenswrapper[4921]: I0318 12:12:17.615086 4921 generic.go:334] "Generic (PLEG): container finished" podID="f3ae80f0-db0e-4503-9362-d18d78c78a02" containerID="d98a97a4c35e0b6712565323439251deff90f54bc456fe6bb2996131c1c67d9f" exitCode=0 Mar 18 12:12:17 crc kubenswrapper[4921]: I0318 12:12:17.616238 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f3ae80f0-db0e-4503-9362-d18d78c78a02","Type":"ContainerDied","Data":"d98a97a4c35e0b6712565323439251deff90f54bc456fe6bb2996131c1c67d9f"} Mar 18 12:12:17 crc kubenswrapper[4921]: I0318 12:12:17.773708 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:17 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:17 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:17 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:17 crc kubenswrapper[4921]: I0318 12:12:17.773859 4921 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:18 crc kubenswrapper[4921]: I0318 12:12:18.587492 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-747dt" Mar 18 12:12:18 crc kubenswrapper[4921]: I0318 12:12:18.701504 4921 ???:1] "http: TLS handshake error from 192.168.126.11:38394: no serving certificate available for the kubelet" Mar 18 12:12:18 crc kubenswrapper[4921]: I0318 12:12:18.776828 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:18 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 12:12:18 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:18 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:18 crc kubenswrapper[4921]: I0318 12:12:18.777076 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:19 crc kubenswrapper[4921]: I0318 12:12:19.254468 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 12:12:19 crc kubenswrapper[4921]: I0318 12:12:19.773530 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:19 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 18 
12:12:19 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:19 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:19 crc kubenswrapper[4921]: I0318 12:12:19.773591 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:20 crc kubenswrapper[4921]: I0318 12:12:20.772254 4921 patch_prober.go:28] interesting pod/router-default-5444994796-52p2z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:12:20 crc kubenswrapper[4921]: [+]has-synced ok Mar 18 12:12:20 crc kubenswrapper[4921]: [+]process-running ok Mar 18 12:12:20 crc kubenswrapper[4921]: healthz check failed Mar 18 12:12:20 crc kubenswrapper[4921]: I0318 12:12:20.772329 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52p2z" podUID="356738b5-d7cc-4ce5-9bd8-1a45bf7630a5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:12:21 crc kubenswrapper[4921]: I0318 12:12:21.229009 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.228994064 podStartE2EDuration="2.228994064s" podCreationTimestamp="2026-03-18 12:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:21.228442729 +0000 UTC m=+160.778363428" watchObservedRunningTime="2026-03-18 12:12:21.228994064 +0000 UTC m=+160.778914703" Mar 18 12:12:21 crc kubenswrapper[4921]: I0318 12:12:21.774820 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:21 crc kubenswrapper[4921]: I0318 12:12:21.779414 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-52p2z" Mar 18 12:12:22 crc kubenswrapper[4921]: I0318 12:12:22.381829 4921 patch_prober.go:28] interesting pod/console-f9d7485db-8s2wr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 18 12:12:22 crc kubenswrapper[4921]: I0318 12:12:22.381921 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8s2wr" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 18 12:12:23 crc kubenswrapper[4921]: I0318 12:12:23.307320 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-swhkn" Mar 18 12:12:23 crc kubenswrapper[4921]: E0318 12:12:23.537465 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:23 crc kubenswrapper[4921]: E0318 12:12:23.567511 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:23 crc kubenswrapper[4921]: E0318 12:12:23.570047 4921 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:23 crc kubenswrapper[4921]: E0318 12:12:23.570131 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:27 crc kubenswrapper[4921]: I0318 12:12:27.269450 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9455c889-9jlf8"] Mar 18 12:12:27 crc kubenswrapper[4921]: I0318 12:12:27.270495 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" podUID="1c3038ab-db0f-4b83-8251-e8391578c76c" containerName="controller-manager" containerID="cri-o://96ebdd374dea5ccab21658acf8e67d5cd0bb5beb83cf350584090589506f26f4" gracePeriod=30 Mar 18 12:12:27 crc kubenswrapper[4921]: I0318 12:12:27.298718 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc"] Mar 18 12:12:27 crc kubenswrapper[4921]: I0318 12:12:27.299239 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" podUID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" containerName="route-controller-manager" containerID="cri-o://eaca1be5671522b8736fedb42dc99fedc451902139e025c4c8e9f5c8e6c25dea" gracePeriod=30 Mar 18 12:12:27 crc kubenswrapper[4921]: I0318 12:12:27.741481 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="1c3038ab-db0f-4b83-8251-e8391578c76c" containerID="96ebdd374dea5ccab21658acf8e67d5cd0bb5beb83cf350584090589506f26f4" exitCode=0 Mar 18 12:12:27 crc kubenswrapper[4921]: I0318 12:12:27.741530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" event={"ID":"1c3038ab-db0f-4b83-8251-e8391578c76c","Type":"ContainerDied","Data":"96ebdd374dea5ccab21658acf8e67d5cd0bb5beb83cf350584090589506f26f4"} Mar 18 12:12:28 crc kubenswrapper[4921]: I0318 12:12:28.751012 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" containerID="eaca1be5671522b8736fedb42dc99fedc451902139e025c4c8e9f5c8e6c25dea" exitCode=0 Mar 18 12:12:28 crc kubenswrapper[4921]: I0318 12:12:28.751097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" event={"ID":"ddd72353-b391-4b64-ac21-2d0aa7ff33b3","Type":"ContainerDied","Data":"eaca1be5671522b8736fedb42dc99fedc451902139e025c4c8e9f5c8e6c25dea"} Mar 18 12:12:30 crc kubenswrapper[4921]: I0318 12:12:30.038077 4921 patch_prober.go:28] interesting pod/route-controller-manager-7885789c5f-c7fhc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 18 12:12:30 crc kubenswrapper[4921]: I0318 12:12:30.038240 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" podUID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 18 12:12:30 crc kubenswrapper[4921]: I0318 12:12:30.073004 4921 patch_prober.go:28] interesting 
pod/controller-manager-7f9455c889-9jlf8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 18 12:12:30 crc kubenswrapper[4921]: I0318 12:12:30.073072 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" podUID="1c3038ab-db0f-4b83-8251-e8391578c76c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.386732 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.414541 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.422633 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.452612 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.564839 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3ae80f0-db0e-4503-9362-d18d78c78a02-kube-api-access\") pod \"f3ae80f0-db0e-4503-9362-d18d78c78a02\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.564925 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ae80f0-db0e-4503-9362-d18d78c78a02-kubelet-dir\") pod \"f3ae80f0-db0e-4503-9362-d18d78c78a02\" (UID: \"f3ae80f0-db0e-4503-9362-d18d78c78a02\") " Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.564974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3ae80f0-db0e-4503-9362-d18d78c78a02-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f3ae80f0-db0e-4503-9362-d18d78c78a02" (UID: "f3ae80f0-db0e-4503-9362-d18d78c78a02"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.565277 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3ae80f0-db0e-4503-9362-d18d78c78a02-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.570639 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ae80f0-db0e-4503-9362-d18d78c78a02-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f3ae80f0-db0e-4503-9362-d18d78c78a02" (UID: "f3ae80f0-db0e-4503-9362-d18d78c78a02"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.665959 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3ae80f0-db0e-4503-9362-d18d78c78a02-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.773677 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.773689 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f3ae80f0-db0e-4503-9362-d18d78c78a02","Type":"ContainerDied","Data":"1b893d43b4785ee5d69a9d13a47641f608bda30ccd0962211791358b79734b09"} Mar 18 12:12:32 crc kubenswrapper[4921]: I0318 12:12:32.773908 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b893d43b4785ee5d69a9d13a47641f608bda30ccd0962211791358b79734b09" Mar 18 12:12:33 crc kubenswrapper[4921]: E0318 12:12:33.537537 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:33 crc kubenswrapper[4921]: E0318 12:12:33.540598 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:33 crc kubenswrapper[4921]: E0318 12:12:33.541700 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:33 crc kubenswrapper[4921]: E0318 12:12:33.541782 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:35 crc kubenswrapper[4921]: E0318 12:12:35.949707 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 12:12:35 crc kubenswrapper[4921]: E0318 12:12:35.950221 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:12:35 crc kubenswrapper[4921]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 12:12:35 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563932-b8gp7_openshift-infra(58d7472e-2ed9-434f-b1f0-6147f9452a11): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 12:12:35 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 18 12:12:35 crc kubenswrapper[4921]: E0318 12:12:35.951505 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" podUID="58d7472e-2ed9-434f-b1f0-6147f9452a11" Mar 18 12:12:36 crc kubenswrapper[4921]: E0318 12:12:36.795593 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" podUID="58d7472e-2ed9-434f-b1f0-6147f9452a11" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.939132 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.946724 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986077 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j"] Mar 18 12:12:38 crc kubenswrapper[4921]: E0318 12:12:38.986403 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ae80f0-db0e-4503-9362-d18d78c78a02" containerName="pruner" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986419 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ae80f0-db0e-4503-9362-d18d78c78a02" containerName="pruner" Mar 18 12:12:38 crc kubenswrapper[4921]: E0318 12:12:38.986438 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" containerName="route-controller-manager" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986446 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" containerName="route-controller-manager" Mar 18 12:12:38 crc kubenswrapper[4921]: E0318 12:12:38.986461 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3038ab-db0f-4b83-8251-e8391578c76c" containerName="controller-manager" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986470 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3038ab-db0f-4b83-8251-e8391578c76c" containerName="controller-manager" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3038ab-db0f-4b83-8251-e8391578c76c" containerName="controller-manager" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986607 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ae80f0-db0e-4503-9362-d18d78c78a02" containerName="pruner" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.986619 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" containerName="route-controller-manager" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.987166 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:38 crc kubenswrapper[4921]: I0318 12:12:38.989756 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j"] Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051749 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-config\") pod \"1c3038ab-db0f-4b83-8251-e8391578c76c\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051798 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-client-ca\") pod \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051821 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk42j\" (UniqueName: \"kubernetes.io/projected/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-kube-api-access-pk42j\") pod \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2x7\" (UniqueName: 
\"kubernetes.io/projected/1c3038ab-db0f-4b83-8251-e8391578c76c-kube-api-access-xj2x7\") pod \"1c3038ab-db0f-4b83-8251-e8391578c76c\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051909 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3038ab-db0f-4b83-8251-e8391578c76c-serving-cert\") pod \"1c3038ab-db0f-4b83-8251-e8391578c76c\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051933 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-client-ca\") pod \"1c3038ab-db0f-4b83-8251-e8391578c76c\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.051984 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-serving-cert\") pod \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.052030 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-proxy-ca-bundles\") pod \"1c3038ab-db0f-4b83-8251-e8391578c76c\" (UID: \"1c3038ab-db0f-4b83-8251-e8391578c76c\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.052052 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-config\") pod \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\" (UID: \"ddd72353-b391-4b64-ac21-2d0aa7ff33b3\") " Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.052558 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "ddd72353-b391-4b64-ac21-2d0aa7ff33b3" (UID: "ddd72353-b391-4b64-ac21-2d0aa7ff33b3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.052670 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-config" (OuterVolumeSpecName: "config") pod "ddd72353-b391-4b64-ac21-2d0aa7ff33b3" (UID: "ddd72353-b391-4b64-ac21-2d0aa7ff33b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.053255 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1c3038ab-db0f-4b83-8251-e8391578c76c" (UID: "1c3038ab-db0f-4b83-8251-e8391578c76c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.053362 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c3038ab-db0f-4b83-8251-e8391578c76c" (UID: "1c3038ab-db0f-4b83-8251-e8391578c76c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.054478 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-config" (OuterVolumeSpecName: "config") pod "1c3038ab-db0f-4b83-8251-e8391578c76c" (UID: "1c3038ab-db0f-4b83-8251-e8391578c76c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.058227 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-kube-api-access-pk42j" (OuterVolumeSpecName: "kube-api-access-pk42j") pod "ddd72353-b391-4b64-ac21-2d0aa7ff33b3" (UID: "ddd72353-b391-4b64-ac21-2d0aa7ff33b3"). InnerVolumeSpecName "kube-api-access-pk42j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.058310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3038ab-db0f-4b83-8251-e8391578c76c-kube-api-access-xj2x7" (OuterVolumeSpecName: "kube-api-access-xj2x7") pod "1c3038ab-db0f-4b83-8251-e8391578c76c" (UID: "1c3038ab-db0f-4b83-8251-e8391578c76c"). InnerVolumeSpecName "kube-api-access-xj2x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.059732 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ddd72353-b391-4b64-ac21-2d0aa7ff33b3" (UID: "ddd72353-b391-4b64-ac21-2d0aa7ff33b3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.073661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3038ab-db0f-4b83-8251-e8391578c76c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c3038ab-db0f-4b83-8251-e8391578c76c" (UID: "1c3038ab-db0f-4b83-8251-e8391578c76c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.153599 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-client-ca\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.153913 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e79610be-b2bf-4f52-b9a1-227bad68d5bf-serving-cert\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154038 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kclh\" (UniqueName: \"kubernetes.io/projected/e79610be-b2bf-4f52-b9a1-227bad68d5bf-kube-api-access-6kclh\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154218 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-config\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154373 4921 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154460 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154540 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk42j\" (UniqueName: \"kubernetes.io/projected/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-kube-api-access-pk42j\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154623 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj2x7\" (UniqueName: \"kubernetes.io/projected/1c3038ab-db0f-4b83-8251-e8391578c76c-kube-api-access-xj2x7\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154738 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c3038ab-db0f-4b83-8251-e8391578c76c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154822 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154903 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.154976 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c3038ab-db0f-4b83-8251-e8391578c76c-proxy-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.155055 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd72353-b391-4b64-ac21-2d0aa7ff33b3-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.201069 4921 ???:1] "http: TLS handshake error from 192.168.126.11:57864: no serving certificate available for the kubelet" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.255550 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kclh\" (UniqueName: \"kubernetes.io/projected/e79610be-b2bf-4f52-b9a1-227bad68d5bf-kube-api-access-6kclh\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.255627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-config\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.255664 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-client-ca\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.255679 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e79610be-b2bf-4f52-b9a1-227bad68d5bf-serving-cert\") pod 
\"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.257051 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-client-ca\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.257248 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-config\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.271093 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e79610be-b2bf-4f52-b9a1-227bad68d5bf-serving-cert\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.272673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kclh\" (UniqueName: \"kubernetes.io/projected/e79610be-b2bf-4f52-b9a1-227bad68d5bf-kube-api-access-6kclh\") pod \"route-controller-manager-65b55468db-8b72j\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.312366 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.809749 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" event={"ID":"1c3038ab-db0f-4b83-8251-e8391578c76c","Type":"ContainerDied","Data":"322a9e991a16471cf92e63cdf681eca92c7ce1c9271cc59e23b2bb5218ac71fc"} Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.809823 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9455c889-9jlf8" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.810205 4921 scope.go:117] "RemoveContainer" containerID="96ebdd374dea5ccab21658acf8e67d5cd0bb5beb83cf350584090589506f26f4" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.812383 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" event={"ID":"ddd72353-b391-4b64-ac21-2d0aa7ff33b3","Type":"ContainerDied","Data":"289db4d430c6d14a7c075251a65f3d32201f42c989941f8d5e27fa8c00a781d1"} Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.812446 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc" Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.832939 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9455c889-9jlf8"] Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.833229 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f9455c889-9jlf8"] Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.842619 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc"] Mar 18 12:12:39 crc kubenswrapper[4921]: I0318 12:12:39.847664 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7885789c5f-c7fhc"] Mar 18 12:12:40 crc kubenswrapper[4921]: E0318 12:12:40.669646 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 12:12:40 crc kubenswrapper[4921]: E0318 12:12:40.669950 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws22g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-km5sn_openshift-marketplace(6c23a325-dab9-40a8-bd8b-1f571140cdca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:12:40 crc kubenswrapper[4921]: E0318 12:12:40.671148 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-km5sn" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" Mar 18 12:12:40 crc 
kubenswrapper[4921]: I0318 12:12:40.819093 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5pf85_3e4be723-81e1-4c74-a380-3ccd634a2f39/kube-multus-additional-cni-plugins/0.log" Mar 18 12:12:40 crc kubenswrapper[4921]: I0318 12:12:40.819165 4921 generic.go:334] "Generic (PLEG): container finished" podID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" exitCode=137 Mar 18 12:12:40 crc kubenswrapper[4921]: I0318 12:12:40.819254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" event={"ID":"3e4be723-81e1-4c74-a380-3ccd634a2f39","Type":"ContainerDied","Data":"74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830"} Mar 18 12:12:41 crc kubenswrapper[4921]: I0318 12:12:41.223594 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c3038ab-db0f-4b83-8251-e8391578c76c" path="/var/lib/kubelet/pods/1c3038ab-db0f-4b83-8251-e8391578c76c/volumes" Mar 18 12:12:41 crc kubenswrapper[4921]: I0318 12:12:41.227452 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd72353-b391-4b64-ac21-2d0aa7ff33b3" path="/var/lib/kubelet/pods/ddd72353-b391-4b64-ac21-2d0aa7ff33b3/volumes" Mar 18 12:12:42 crc kubenswrapper[4921]: E0318 12:12:42.077801 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-km5sn" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" Mar 18 12:12:42 crc kubenswrapper[4921]: E0318 12:12:42.231226 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" 
Mar 18 12:12:42 crc kubenswrapper[4921]: E0318 12:12:42.231421 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wtq8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8gvwb_openshift-marketplace(63c09902-e057-4d3a-811f-e068f2ebe716): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:12:42 crc kubenswrapper[4921]: E0318 12:12:42.232613 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8gvwb" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" Mar 18 12:12:42 crc kubenswrapper[4921]: I0318 12:12:42.898926 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g8jgv" Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.536282 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830 is running failed: container process not found" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.537136 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830 is running failed: container process not found" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.537648 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830 is running failed: container process not found" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.537678 4921 
prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.608018 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8gvwb" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.724244 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.724493 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tr7vf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dshr9_openshift-marketplace(b49f7bf4-ce72-4b66-8d0e-b2061d228a58): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.730760 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7857fc67d9-68r27"] Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.731716 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:43 crc kubenswrapper[4921]: E0318 12:12:43.735067 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dshr9" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.750571 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.751848 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.753577 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.764980 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.765181 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.765316 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.765940 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7857fc67d9-68r27"] Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.767360 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 
12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.920213 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-client-ca\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.920286 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-proxy-ca-bundles\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.920326 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-serving-cert\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.920404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-config\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:43 crc kubenswrapper[4921]: I0318 12:12:43.920430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgjl\" (UniqueName: 
\"kubernetes.io/projected/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-kube-api-access-zkgjl\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.021602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-client-ca\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.021657 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-proxy-ca-bundles\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.021681 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-serving-cert\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.021741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkgjl\" (UniqueName: \"kubernetes.io/projected/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-kube-api-access-zkgjl\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 
12:12:44.021759 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-config\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.022830 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-client-ca\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.023540 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-proxy-ca-bundles\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.023874 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-config\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.029046 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-serving-cert\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 
12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.039558 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkgjl\" (UniqueName: \"kubernetes.io/projected/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-kube-api-access-zkgjl\") pod \"controller-manager-7857fc67d9-68r27\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:44 crc kubenswrapper[4921]: I0318 12:12:44.079588 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.126960 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.128427 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.132319 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.132593 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.134892 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.140624 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 
12:12:45.140936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.241867 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.241944 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.242018 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.271742 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:45 crc kubenswrapper[4921]: I0318 12:12:45.464623 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.267231 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7857fc67d9-68r27"] Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.373767 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j"] Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.637239 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dshr9" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.677968 4921 scope.go:117] "RemoveContainer" containerID="eaca1be5671522b8736fedb42dc99fedc451902139e025c4c8e9f5c8e6c25dea" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.756324 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5pf85_3e4be723-81e1-4c74-a380-3ccd634a2f39/kube-multus-additional-cni-plugins/0.log" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.756403 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.788967 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.789404 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m6kt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hx4pr_openshift-marketplace(9350ca3c-fa40-4169-87f2-06ac9d6c16bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.790667 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hx4pr" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.790880 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.791028 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4lj7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-45chs_openshift-marketplace(21bdcd14-9430-4bf6-847e-6a31f0efd11a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.792525 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-45chs" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" Mar 18 12:12:47 crc 
kubenswrapper[4921]: E0318 12:12:47.797741 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.797854 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qx4fw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-pf9vr_openshift-marketplace(09b95848-38ec-4890-9cd2-83bc2e137c4a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.799216 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pf9vr" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.863497 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5pf85_3e4be723-81e1-4c74-a380-3ccd634a2f39/kube-multus-additional-cni-plugins/0.log" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.863603 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.863595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5pf85" event={"ID":"3e4be723-81e1-4c74-a380-3ccd634a2f39","Type":"ContainerDied","Data":"41e3dcb0a6c0ea970ec9a443607fd2e3688b37c7d3a685244d9f111ed5963329"} Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.863742 4921 scope.go:117] "RemoveContainer" containerID="74babcd1acf4c961abfd60e891cb4157123fa424e49512c05d3d8158146de830" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.875363 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pf9vr" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.884138 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e4be723-81e1-4c74-a380-3ccd634a2f39-tuning-conf-dir\") pod \"3e4be723-81e1-4c74-a380-3ccd634a2f39\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.884183 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e4be723-81e1-4c74-a380-3ccd634a2f39-ready\") pod \"3e4be723-81e1-4c74-a380-3ccd634a2f39\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.884251 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e4be723-81e1-4c74-a380-3ccd634a2f39-cni-sysctl-allowlist\") pod \"3e4be723-81e1-4c74-a380-3ccd634a2f39\" (UID: 
\"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.884313 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb4br\" (UniqueName: \"kubernetes.io/projected/3e4be723-81e1-4c74-a380-3ccd634a2f39-kube-api-access-hb4br\") pod \"3e4be723-81e1-4c74-a380-3ccd634a2f39\" (UID: \"3e4be723-81e1-4c74-a380-3ccd634a2f39\") " Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.884916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e4be723-81e1-4c74-a380-3ccd634a2f39-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "3e4be723-81e1-4c74-a380-3ccd634a2f39" (UID: "3e4be723-81e1-4c74-a380-3ccd634a2f39"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.885448 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4be723-81e1-4c74-a380-3ccd634a2f39-ready" (OuterVolumeSpecName: "ready") pod "3e4be723-81e1-4c74-a380-3ccd634a2f39" (UID: "3e4be723-81e1-4c74-a380-3ccd634a2f39"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.885654 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4be723-81e1-4c74-a380-3ccd634a2f39-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "3e4be723-81e1-4c74-a380-3ccd634a2f39" (UID: "3e4be723-81e1-4c74-a380-3ccd634a2f39"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.886620 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hx4pr" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" Mar 18 12:12:47 crc kubenswrapper[4921]: E0318 12:12:47.887488 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-45chs" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.897655 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4be723-81e1-4c74-a380-3ccd634a2f39-kube-api-access-hb4br" (OuterVolumeSpecName: "kube-api-access-hb4br") pod "3e4be723-81e1-4c74-a380-3ccd634a2f39" (UID: "3e4be723-81e1-4c74-a380-3ccd634a2f39"). InnerVolumeSpecName "kube-api-access-hb4br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.988439 4921 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e4be723-81e1-4c74-a380-3ccd634a2f39-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.988482 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb4br\" (UniqueName: \"kubernetes.io/projected/3e4be723-81e1-4c74-a380-3ccd634a2f39-kube-api-access-hb4br\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.988494 4921 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e4be723-81e1-4c74-a380-3ccd634a2f39-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:47 crc kubenswrapper[4921]: I0318 12:12:47.988508 4921 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e4be723-81e1-4c74-a380-3ccd634a2f39-ready\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.033841 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:12:48 crc kubenswrapper[4921]: W0318 12:12:48.045910 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5825bf61_f4e1_4ee4_8d23_a66809b454b5.slice/crio-a29a34d2951a6af83db0bb37848a6426d6c9a2e83a67e79b3de1b792f08586f1 WatchSource:0}: Error finding container a29a34d2951a6af83db0bb37848a6426d6c9a2e83a67e79b3de1b792f08586f1: Status 404 returned error can't find the container with id a29a34d2951a6af83db0bb37848a6426d6c9a2e83a67e79b3de1b792f08586f1 Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.154726 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-7857fc67d9-68r27"] Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.175002 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j"] Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.196798 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5pf85"] Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.199808 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5pf85"] Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.878386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5825bf61-f4e1-4ee4-8d23-a66809b454b5","Type":"ContainerStarted","Data":"cc04f41fd604a15a3eddf6c7dcd7f49713fa5b7c4ee283263a249ec83734b358"} Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.878750 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5825bf61-f4e1-4ee4-8d23-a66809b454b5","Type":"ContainerStarted","Data":"a29a34d2951a6af83db0bb37848a6426d6c9a2e83a67e79b3de1b792f08586f1"} Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.879799 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" event={"ID":"e79610be-b2bf-4f52-b9a1-227bad68d5bf","Type":"ContainerStarted","Data":"a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b"} Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.879839 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" event={"ID":"e79610be-b2bf-4f52-b9a1-227bad68d5bf","Type":"ContainerStarted","Data":"a5351ca1638512128b8230ec47a0050c85cf4b054341d535a40011522431827d"} Mar 18 12:12:48 crc 
kubenswrapper[4921]: I0318 12:12:48.879850 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" podUID="e79610be-b2bf-4f52-b9a1-227bad68d5bf" containerName="route-controller-manager" containerID="cri-o://a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b" gracePeriod=30 Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.880028 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.883663 4921 generic.go:334] "Generic (PLEG): container finished" podID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerID="5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e" exitCode=0 Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.883733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8k6" event={"ID":"125cafaf-afed-45eb-b6c9-0f06ee2637ec","Type":"ContainerDied","Data":"5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e"} Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.888259 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.891308 4921 generic.go:334] "Generic (PLEG): container finished" podID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerID="d33e3016ab913cf8f8ceea02f83e6859a303fe76b7921b99bb8fd48cecba9df1" exitCode=0 Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.891361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz2hx" event={"ID":"cf28d70d-8b0a-4ae1-9747-99e9d42767a6","Type":"ContainerDied","Data":"d33e3016ab913cf8f8ceea02f83e6859a303fe76b7921b99bb8fd48cecba9df1"} Mar 18 12:12:48 crc 
kubenswrapper[4921]: I0318 12:12:48.894183 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" event={"ID":"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d","Type":"ContainerStarted","Data":"44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c"} Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.894212 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" event={"ID":"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d","Type":"ContainerStarted","Data":"b26f743dfa55532d421ccf7c513b6882f654e377346dc075b7c0420e9ccdbd69"} Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.894283 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" podUID="ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" containerName="controller-manager" containerID="cri-o://44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c" gracePeriod=30 Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.894582 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.900825 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.900807165 podStartE2EDuration="3.900807165s" podCreationTimestamp="2026-03-18 12:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:48.900355922 +0000 UTC m=+188.450276561" watchObservedRunningTime="2026-03-18 12:12:48.900807165 +0000 UTC m=+188.450727794" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.901303 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.960249 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" podStartSLOduration=21.960227481 podStartE2EDuration="21.960227481s" podCreationTimestamp="2026-03-18 12:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:48.943079189 +0000 UTC m=+188.492999848" watchObservedRunningTime="2026-03-18 12:12:48.960227481 +0000 UTC m=+188.510148120" Mar 18 12:12:48 crc kubenswrapper[4921]: I0318 12:12:48.987036 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" podStartSLOduration=21.987009699 podStartE2EDuration="21.987009699s" podCreationTimestamp="2026-03-18 12:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:48.983510503 +0000 UTC m=+188.533431162" watchObservedRunningTime="2026-03-18 12:12:48.987009699 +0000 UTC m=+188.536930338" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.216236 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" path="/var/lib/kubelet/pods/3e4be723-81e1-4c74-a380-3ccd634a2f39/volumes" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.221549 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.227702 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.419875 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-serving-cert\") pod \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.420450 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-client-ca\") pod \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.420532 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e79610be-b2bf-4f52-b9a1-227bad68d5bf-serving-cert\") pod \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.420627 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-proxy-ca-bundles\") pod \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.420689 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkgjl\" (UniqueName: \"kubernetes.io/projected/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-kube-api-access-zkgjl\") pod \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.420726 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-config\") pod \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.420749 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kclh\" (UniqueName: \"kubernetes.io/projected/e79610be-b2bf-4f52-b9a1-227bad68d5bf-kube-api-access-6kclh\") pod \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.421062 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-client-ca" (OuterVolumeSpecName: "client-ca") pod "e79610be-b2bf-4f52-b9a1-227bad68d5bf" (UID: "e79610be-b2bf-4f52-b9a1-227bad68d5bf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.421770 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-config" (OuterVolumeSpecName: "config") pod "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" (UID: "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.421786 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" (UID: "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.421841 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-client-ca\") pod \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\" (UID: \"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.421877 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-config\") pod \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\" (UID: \"e79610be-b2bf-4f52-b9a1-227bad68d5bf\") " Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.422069 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.422087 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.422104 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.422490 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-client-ca" (OuterVolumeSpecName: "client-ca") pod "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" (UID: "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.422712 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-config" (OuterVolumeSpecName: "config") pod "e79610be-b2bf-4f52-b9a1-227bad68d5bf" (UID: "e79610be-b2bf-4f52-b9a1-227bad68d5bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.426592 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79610be-b2bf-4f52-b9a1-227bad68d5bf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e79610be-b2bf-4f52-b9a1-227bad68d5bf" (UID: "e79610be-b2bf-4f52-b9a1-227bad68d5bf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.427260 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-kube-api-access-zkgjl" (OuterVolumeSpecName: "kube-api-access-zkgjl") pod "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" (UID: "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d"). InnerVolumeSpecName "kube-api-access-zkgjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.427836 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e79610be-b2bf-4f52-b9a1-227bad68d5bf-kube-api-access-6kclh" (OuterVolumeSpecName: "kube-api-access-6kclh") pod "e79610be-b2bf-4f52-b9a1-227bad68d5bf" (UID: "e79610be-b2bf-4f52-b9a1-227bad68d5bf"). InnerVolumeSpecName "kube-api-access-6kclh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.435650 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" (UID: "ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.523929 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkgjl\" (UniqueName: \"kubernetes.io/projected/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-kube-api-access-zkgjl\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.523977 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kclh\" (UniqueName: \"kubernetes.io/projected/e79610be-b2bf-4f52-b9a1-227bad68d5bf-kube-api-access-6kclh\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.523991 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.524006 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e79610be-b2bf-4f52-b9a1-227bad68d5bf-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.524018 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.524030 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e79610be-b2bf-4f52-b9a1-227bad68d5bf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.901632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz2hx" event={"ID":"cf28d70d-8b0a-4ae1-9747-99e9d42767a6","Type":"ContainerStarted","Data":"cb98cef3fe2736ceb26d23c1a17bc20eb06bc1ab780ff27ec9e86f801b63ac58"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.903394 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" containerID="44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c" exitCode=0 Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.903458 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" event={"ID":"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d","Type":"ContainerDied","Data":"44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.903483 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" event={"ID":"ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d","Type":"ContainerDied","Data":"b26f743dfa55532d421ccf7c513b6882f654e377346dc075b7c0420e9ccdbd69"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.903499 4921 scope.go:117] "RemoveContainer" containerID="44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.903598 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7857fc67d9-68r27" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.914262 4921 generic.go:334] "Generic (PLEG): container finished" podID="5825bf61-f4e1-4ee4-8d23-a66809b454b5" containerID="cc04f41fd604a15a3eddf6c7dcd7f49713fa5b7c4ee283263a249ec83734b358" exitCode=0 Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.914343 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5825bf61-f4e1-4ee4-8d23-a66809b454b5","Type":"ContainerDied","Data":"cc04f41fd604a15a3eddf6c7dcd7f49713fa5b7c4ee283263a249ec83734b358"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.916089 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" event={"ID":"58d7472e-2ed9-434f-b1f0-6147f9452a11","Type":"ContainerStarted","Data":"46fe56158e9f7b1129d5ba778b7a121499e643f269f2251f712fce2c80db70f7"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.920513 4921 generic.go:334] "Generic (PLEG): container finished" podID="e79610be-b2bf-4f52-b9a1-227bad68d5bf" containerID="a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b" exitCode=0 Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.920548 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" event={"ID":"e79610be-b2bf-4f52-b9a1-227bad68d5bf","Type":"ContainerDied","Data":"a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.920571 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.920584 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j" event={"ID":"e79610be-b2bf-4f52-b9a1-227bad68d5bf","Type":"ContainerDied","Data":"a5351ca1638512128b8230ec47a0050c85cf4b054341d535a40011522431827d"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.931926 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8k6" event={"ID":"125cafaf-afed-45eb-b6c9-0f06ee2637ec","Type":"ContainerStarted","Data":"226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310"} Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.949512 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vz2hx" podStartSLOduration=3.05033821 podStartE2EDuration="36.949485841s" podCreationTimestamp="2026-03-18 12:12:13 +0000 UTC" firstStartedPulling="2026-03-18 12:12:15.410742472 +0000 UTC m=+154.960663111" lastFinishedPulling="2026-03-18 12:12:49.309890113 +0000 UTC m=+188.859810742" observedRunningTime="2026-03-18 12:12:49.939471065 +0000 UTC m=+189.489391724" watchObservedRunningTime="2026-03-18 12:12:49.949485841 +0000 UTC m=+189.499406500" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.952848 4921 scope.go:117] "RemoveContainer" containerID="44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c" Mar 18 12:12:49 crc kubenswrapper[4921]: E0318 12:12:49.953788 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c\": container with ID starting with 44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c not found: ID does not exist" 
containerID="44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.953830 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c"} err="failed to get container status \"44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c\": rpc error: code = NotFound desc = could not find container \"44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c\": container with ID starting with 44c32d94fe0269624f76d7eb4ae43d9fda58285af88653bd736c8f067341673c not found: ID does not exist" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.953869 4921 scope.go:117] "RemoveContainer" containerID="a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.961529 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j"] Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.965567 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b55468db-8b72j"] Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.972164 4921 scope.go:117] "RemoveContainer" containerID="a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b" Mar 18 12:12:49 crc kubenswrapper[4921]: E0318 12:12:49.976378 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b\": container with ID starting with a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b not found: ID does not exist" containerID="a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.976433 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b"} err="failed to get container status \"a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b\": rpc error: code = NotFound desc = could not find container \"a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b\": container with ID starting with a62ab4d634307fa04cec9716d23a642ad16de972e985f9a52b2e7c684f261a2b not found: ID does not exist" Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.991024 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7857fc67d9-68r27"] Mar 18 12:12:49 crc kubenswrapper[4921]: I0318 12:12:49.998648 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7857fc67d9-68r27"] Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.008946 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" podStartSLOduration=6.941365381 podStartE2EDuration="50.008924399s" podCreationTimestamp="2026-03-18 12:12:00 +0000 UTC" firstStartedPulling="2026-03-18 12:12:06.431450864 +0000 UTC m=+145.981371503" lastFinishedPulling="2026-03-18 12:12:49.499009882 +0000 UTC m=+189.048930521" observedRunningTime="2026-03-18 12:12:50.007403677 +0000 UTC m=+189.557324326" watchObservedRunningTime="2026-03-18 12:12:50.008924399 +0000 UTC m=+189.558845168" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.213064 4921 csr.go:261] certificate signing request csr-p6rnr is approved, waiting to be issued Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.219981 4921 csr.go:257] certificate signing request csr-p6rnr is issued Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.732189 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4c8k6" 
podStartSLOduration=4.698374601 podStartE2EDuration="40.732165842s" podCreationTimestamp="2026-03-18 12:12:10 +0000 UTC" firstStartedPulling="2026-03-18 12:12:13.267993288 +0000 UTC m=+152.817913927" lastFinishedPulling="2026-03-18 12:12:49.301784529 +0000 UTC m=+188.851705168" observedRunningTime="2026-03-18 12:12:50.025533666 +0000 UTC m=+189.575454325" watchObservedRunningTime="2026-03-18 12:12:50.732165842 +0000 UTC m=+190.282086491" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734287 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bc66f4c79-zscm8"] Mar 18 12:12:50 crc kubenswrapper[4921]: E0318 12:12:50.734617 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79610be-b2bf-4f52-b9a1-227bad68d5bf" containerName="route-controller-manager" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734633 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79610be-b2bf-4f52-b9a1-227bad68d5bf" containerName="route-controller-manager" Mar 18 12:12:50 crc kubenswrapper[4921]: E0318 12:12:50.734644 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" containerName="controller-manager" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734654 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" containerName="controller-manager" Mar 18 12:12:50 crc kubenswrapper[4921]: E0318 12:12:50.734668 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734676 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734808 4921 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" containerName="controller-manager" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734824 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4be723-81e1-4c74-a380-3ccd634a2f39" containerName="kube-multus-additional-cni-plugins" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.734833 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e79610be-b2bf-4f52-b9a1-227bad68d5bf" containerName="route-controller-manager" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.735336 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.740856 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.751580 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55484698f-q62xv"] Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.754247 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.754252 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.754537 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.757463 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.758549 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.758978 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.758984 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.759128 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.758549 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.760402 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.760612 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.760797 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 
18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.763625 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.766489 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc66f4c79-zscm8"] Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.770709 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55484698f-q62xv"] Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842595 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-proxy-ca-bundles\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842659 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-client-ca\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842692 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-config\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842735 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-config\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842822 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxbh\" (UniqueName: \"kubernetes.io/projected/8345fe02-45ff-4afa-8736-fb518b9ec608-kube-api-access-cbxbh\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842883 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-serving-cert\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.842909 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-client-ca\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.843175 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86tvv\" (UniqueName: \"kubernetes.io/projected/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-kube-api-access-86tvv\") pod 
\"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.843296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8345fe02-45ff-4afa-8736-fb518b9ec608-serving-cert\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.939675 4921 generic.go:334] "Generic (PLEG): container finished" podID="58d7472e-2ed9-434f-b1f0-6147f9452a11" containerID="46fe56158e9f7b1129d5ba778b7a121499e643f269f2251f712fce2c80db70f7" exitCode=0 Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.939761 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" event={"ID":"58d7472e-2ed9-434f-b1f0-6147f9452a11","Type":"ContainerDied","Data":"46fe56158e9f7b1129d5ba778b7a121499e643f269f2251f712fce2c80db70f7"} Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-serving-cert\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947346 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-client-ca\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " 
pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947392 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86tvv\" (UniqueName: \"kubernetes.io/projected/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-kube-api-access-86tvv\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947425 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8345fe02-45ff-4afa-8736-fb518b9ec608-serving-cert\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947489 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-proxy-ca-bundles\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947514 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-client-ca\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947544 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-config\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947572 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-config\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.947603 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxbh\" (UniqueName: \"kubernetes.io/projected/8345fe02-45ff-4afa-8736-fb518b9ec608-kube-api-access-cbxbh\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.948987 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-client-ca\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.949045 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-proxy-ca-bundles\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 
12:12:50.949104 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-client-ca\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.950084 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-config\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.950377 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-config\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.954423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8345fe02-45ff-4afa-8736-fb518b9ec608-serving-cert\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.955565 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-serving-cert\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" 
Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.966771 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxbh\" (UniqueName: \"kubernetes.io/projected/8345fe02-45ff-4afa-8736-fb518b9ec608-kube-api-access-cbxbh\") pod \"route-controller-manager-55484698f-q62xv\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:50 crc kubenswrapper[4921]: I0318 12:12:50.969034 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86tvv\" (UniqueName: \"kubernetes.io/projected/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-kube-api-access-86tvv\") pod \"controller-manager-7bc66f4c79-zscm8\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.067793 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.085641 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.215647 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.217473 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d" path="/var/lib/kubelet/pods/ddb3645d-e7fc-4cbd-8bdf-f815d4ad4e0d/volumes" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.218060 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e79610be-b2bf-4f52-b9a1-227bad68d5bf" path="/var/lib/kubelet/pods/e79610be-b2bf-4f52-b9a1-227bad68d5bf/volumes" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.223770 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-11 13:39:44.755597071 +0000 UTC Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.223805 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6433h26m53.531794596s for next certificate rotation Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.291457 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.291499 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.315354 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc66f4c79-zscm8"] Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.321330 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:12:51 crc kubenswrapper[4921]: E0318 12:12:51.321586 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825bf61-f4e1-4ee4-8d23-a66809b454b5" containerName="pruner" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 
12:12:51.321598 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825bf61-f4e1-4ee4-8d23-a66809b454b5" containerName="pruner" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.321708 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5825bf61-f4e1-4ee4-8d23-a66809b454b5" containerName="pruner" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.322066 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.336504 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:12:51 crc kubenswrapper[4921]: W0318 12:12:51.344399 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb48abc_efe9_4882_9e8f_89753ddb0bb3.slice/crio-c999044af2e2f18688b4eecb0e586bb433aee7e77594a9618f06103a9be5758c WatchSource:0}: Error finding container c999044af2e2f18688b4eecb0e586bb433aee7e77594a9618f06103a9be5758c: Status 404 returned error can't find the container with id c999044af2e2f18688b4eecb0e586bb433aee7e77594a9618f06103a9be5758c Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.354885 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kube-api-access\") pod \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.354911 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kubelet-dir\") pod \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\" (UID: \"5825bf61-f4e1-4ee4-8d23-a66809b454b5\") " Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.355260 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5825bf61-f4e1-4ee4-8d23-a66809b454b5" (UID: "5825bf61-f4e1-4ee4-8d23-a66809b454b5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.362604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5825bf61-f4e1-4ee4-8d23-a66809b454b5" (UID: "5825bf61-f4e1-4ee4-8d23-a66809b454b5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.456308 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-var-lock\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.456358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c57598a3-9521-4697-969f-6c365d74c4e7-kube-api-access\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.456467 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 
12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.456520 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.456531 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5825bf61-f4e1-4ee4-8d23-a66809b454b5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.557283 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.557391 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-var-lock\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.557424 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c57598a3-9521-4697-969f-6c365d74c4e7-kube-api-access\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.557439 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.557499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-var-lock\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.561451 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55484698f-q62xv"] Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.582747 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c57598a3-9521-4697-969f-6c365d74c4e7-kube-api-access\") pod \"installer-9-crc\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.667098 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.949920 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" event={"ID":"3bb48abc-efe9-4882-9e8f-89753ddb0bb3","Type":"ContainerStarted","Data":"846aba68780d2f0b5bb0c3a33168928580728b92868e3b62c33a31dc6b4f9577"} Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.950386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" event={"ID":"3bb48abc-efe9-4882-9e8f-89753ddb0bb3","Type":"ContainerStarted","Data":"c999044af2e2f18688b4eecb0e586bb433aee7e77594a9618f06103a9be5758c"} Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.951011 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.951828 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.953193 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" event={"ID":"8345fe02-45ff-4afa-8736-fb518b9ec608","Type":"ContainerStarted","Data":"ecebae7c005972aa10c7ddade2c5200176f93a89ef926d4ee9b04b4002bd76c1"} Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.953230 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" event={"ID":"8345fe02-45ff-4afa-8736-fb518b9ec608","Type":"ContainerStarted","Data":"92c8b457f428dbad19108e03454ac9c767bfe331f3d5c7cce131a53bf89e8e09"} Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.954192 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.960003 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5825bf61-f4e1-4ee4-8d23-a66809b454b5","Type":"ContainerDied","Data":"a29a34d2951a6af83db0bb37848a6426d6c9a2e83a67e79b3de1b792f08586f1"} Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.960038 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a29a34d2951a6af83db0bb37848a6426d6c9a2e83a67e79b3de1b792f08586f1" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.960094 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:12:51 crc kubenswrapper[4921]: I0318 12:12:51.974155 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.027045 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" podStartSLOduration=5.027019351 podStartE2EDuration="5.027019351s" podCreationTimestamp="2026-03-18 12:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:51.984326995 +0000 UTC m=+191.534247634" watchObservedRunningTime="2026-03-18 12:12:52.027019351 +0000 UTC m=+191.576939990" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.053706 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" podStartSLOduration=5.053683245 podStartE2EDuration="5.053683245s" podCreationTimestamp="2026-03-18 12:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:52.033704495 +0000 UTC m=+191.583625134" watchObservedRunningTime="2026-03-18 12:12:52.053683245 +0000 UTC m=+191.603603874" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.129814 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.223949 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-13 23:11:11.231565123 +0000 UTC Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.224005 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5770h58m19.007562486s for next certificate rotation Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.320669 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.449890 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4c8k6" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="registry-server" probeResult="failure" output=< Mar 18 12:12:52 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 12:12:52 crc kubenswrapper[4921]: > Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.470621 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggszn\" (UniqueName: \"kubernetes.io/projected/58d7472e-2ed9-434f-b1f0-6147f9452a11-kube-api-access-ggszn\") pod \"58d7472e-2ed9-434f-b1f0-6147f9452a11\" (UID: \"58d7472e-2ed9-434f-b1f0-6147f9452a11\") " Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.476736 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/58d7472e-2ed9-434f-b1f0-6147f9452a11-kube-api-access-ggszn" (OuterVolumeSpecName: "kube-api-access-ggszn") pod "58d7472e-2ed9-434f-b1f0-6147f9452a11" (UID: "58d7472e-2ed9-434f-b1f0-6147f9452a11"). InnerVolumeSpecName "kube-api-access-ggszn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.571963 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggszn\" (UniqueName: \"kubernetes.io/projected/58d7472e-2ed9-434f-b1f0-6147f9452a11-kube-api-access-ggszn\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.966293 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c57598a3-9521-4697-969f-6c365d74c4e7","Type":"ContainerStarted","Data":"22e248748409d7ca9e1e7d6765ba452b1c8a334dc77c4107f17afa3938c41b03"} Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.966341 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c57598a3-9521-4697-969f-6c365d74c4e7","Type":"ContainerStarted","Data":"ef8c177ef244a4c65dc84ccf02420b2f47630ca38ab961755f97ea323693e375"} Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.970528 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" event={"ID":"58d7472e-2ed9-434f-b1f0-6147f9452a11","Type":"ContainerDied","Data":"b58adb2adfaae33ec71d29bdb911dbe60f82bdc127af08eeb13a40956aef0e45"} Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.970672 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b58adb2adfaae33ec71d29bdb911dbe60f82bdc127af08eeb13a40956aef0e45" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.970821 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-b8gp7" Mar 18 12:12:52 crc kubenswrapper[4921]: I0318 12:12:52.986495 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.986470219 podStartE2EDuration="1.986470219s" podCreationTimestamp="2026-03-18 12:12:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:52.98214242 +0000 UTC m=+192.532063079" watchObservedRunningTime="2026-03-18 12:12:52.986470219 +0000 UTC m=+192.536390848" Mar 18 12:12:53 crc kubenswrapper[4921]: I0318 12:12:53.526130 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:53 crc kubenswrapper[4921]: I0318 12:12:53.526641 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:53 crc kubenswrapper[4921]: I0318 12:12:53.613253 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:54 crc kubenswrapper[4921]: I0318 12:12:54.484920 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:55 crc kubenswrapper[4921]: I0318 12:12:55.034104 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:55 crc kubenswrapper[4921]: I0318 12:12:55.104700 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz2hx"] Mar 18 12:12:56 crc kubenswrapper[4921]: I0318 12:12:56.991191 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vz2hx" 
podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="registry-server" containerID="cri-o://cb98cef3fe2736ceb26d23c1a17bc20eb06bc1ab780ff27ec9e86f801b63ac58" gracePeriod=2 Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.005976 4921 generic.go:334] "Generic (PLEG): container finished" podID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerID="cb98cef3fe2736ceb26d23c1a17bc20eb06bc1ab780ff27ec9e86f801b63ac58" exitCode=0 Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.006030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz2hx" event={"ID":"cf28d70d-8b0a-4ae1-9747-99e9d42767a6","Type":"ContainerDied","Data":"cb98cef3fe2736ceb26d23c1a17bc20eb06bc1ab780ff27ec9e86f801b63ac58"} Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.096372 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.253965 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wb6b\" (UniqueName: \"kubernetes.io/projected/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-kube-api-access-7wb6b\") pod \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.254056 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-utilities\") pod \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\" (UID: \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.254184 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-catalog-content\") pod \"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\" (UID: 
\"cf28d70d-8b0a-4ae1-9747-99e9d42767a6\") " Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.255608 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-utilities" (OuterVolumeSpecName: "utilities") pod "cf28d70d-8b0a-4ae1-9747-99e9d42767a6" (UID: "cf28d70d-8b0a-4ae1-9747-99e9d42767a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.263337 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-kube-api-access-7wb6b" (OuterVolumeSpecName: "kube-api-access-7wb6b") pod "cf28d70d-8b0a-4ae1-9747-99e9d42767a6" (UID: "cf28d70d-8b0a-4ae1-9747-99e9d42767a6"). InnerVolumeSpecName "kube-api-access-7wb6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.287089 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf28d70d-8b0a-4ae1-9747-99e9d42767a6" (UID: "cf28d70d-8b0a-4ae1-9747-99e9d42767a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.356450 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.356481 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wb6b\" (UniqueName: \"kubernetes.io/projected/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-kube-api-access-7wb6b\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:58 crc kubenswrapper[4921]: I0318 12:12:58.356492 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28d70d-8b0a-4ae1-9747-99e9d42767a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.015484 4921 generic.go:334] "Generic (PLEG): container finished" podID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerID="a808a8be1a2d62fbe6700eaf49d5a69bec8c366233449a880f421b202dcf95d9" exitCode=0 Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.015875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-km5sn" event={"ID":"6c23a325-dab9-40a8-bd8b-1f571140cdca","Type":"ContainerDied","Data":"a808a8be1a2d62fbe6700eaf49d5a69bec8c366233449a880f421b202dcf95d9"} Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.022883 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz2hx" event={"ID":"cf28d70d-8b0a-4ae1-9747-99e9d42767a6","Type":"ContainerDied","Data":"e14069dd873dbbceb1f28850baffbc20cc43c9517481f4619491ce8046341167"} Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.022936 4921 scope.go:117] "RemoveContainer" containerID="cb98cef3fe2736ceb26d23c1a17bc20eb06bc1ab780ff27ec9e86f801b63ac58" Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 
12:12:59.022979 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz2hx" Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.047438 4921 scope.go:117] "RemoveContainer" containerID="d33e3016ab913cf8f8ceea02f83e6859a303fe76b7921b99bb8fd48cecba9df1" Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.071642 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz2hx"] Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.071839 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz2hx"] Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.093151 4921 scope.go:117] "RemoveContainer" containerID="6e24826f8a7d51093861a8d9766550ff5209df698a121eb3d2b6c38bd3679545" Mar 18 12:12:59 crc kubenswrapper[4921]: I0318 12:12:59.216241 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" path="/var/lib/kubelet/pods/cf28d70d-8b0a-4ae1-9747-99e9d42767a6/volumes" Mar 18 12:13:00 crc kubenswrapper[4921]: I0318 12:13:00.031317 4921 generic.go:334] "Generic (PLEG): container finished" podID="63c09902-e057-4d3a-811f-e068f2ebe716" containerID="7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1" exitCode=0 Mar 18 12:13:00 crc kubenswrapper[4921]: I0318 12:13:00.032098 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gvwb" event={"ID":"63c09902-e057-4d3a-811f-e068f2ebe716","Type":"ContainerDied","Data":"7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1"} Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.038596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gvwb" 
event={"ID":"63c09902-e057-4d3a-811f-e068f2ebe716","Type":"ContainerStarted","Data":"e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965"} Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.040891 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-km5sn" event={"ID":"6c23a325-dab9-40a8-bd8b-1f571140cdca","Type":"ContainerStarted","Data":"6aad3434490562b863b21fa2dafd86ca52f72e6c83c829ab4e9e0ff49ff875a0"} Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.042358 4921 generic.go:334] "Generic (PLEG): container finished" podID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerID="b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939" exitCode=0 Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.042392 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf9vr" event={"ID":"09b95848-38ec-4890-9cd2-83bc2e137c4a","Type":"ContainerDied","Data":"b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939"} Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.062290 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8gvwb" podStartSLOduration=2.7615742 podStartE2EDuration="51.062271128s" podCreationTimestamp="2026-03-18 12:12:10 +0000 UTC" firstStartedPulling="2026-03-18 12:12:12.223406483 +0000 UTC m=+151.773327122" lastFinishedPulling="2026-03-18 12:13:00.524103421 +0000 UTC m=+200.074024050" observedRunningTime="2026-03-18 12:13:01.060409252 +0000 UTC m=+200.610329891" watchObservedRunningTime="2026-03-18 12:13:01.062271128 +0000 UTC m=+200.612191767" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.065232 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.065420 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.078438 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-km5sn" podStartSLOduration=3.427855595 podStartE2EDuration="50.078421757s" podCreationTimestamp="2026-03-18 12:12:11 +0000 UTC" firstStartedPulling="2026-03-18 12:12:13.260948253 +0000 UTC m=+152.810868892" lastFinishedPulling="2026-03-18 12:12:59.911514415 +0000 UTC m=+199.461435054" observedRunningTime="2026-03-18 12:13:01.077668205 +0000 UTC m=+200.627588844" watchObservedRunningTime="2026-03-18 12:13:01.078421757 +0000 UTC m=+200.628342406" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.344484 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.400964 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.550254 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:13:01 crc kubenswrapper[4921]: I0318 12:13:01.550290 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-km5sn" Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.052067 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf9vr" event={"ID":"09b95848-38ec-4890-9cd2-83bc2e137c4a","Type":"ContainerStarted","Data":"becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b"} Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.054409 4921 generic.go:334] "Generic (PLEG): container finished" podID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" 
containerID="44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda" exitCode=0 Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.054482 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45chs" event={"ID":"21bdcd14-9430-4bf6-847e-6a31f0efd11a","Type":"ContainerDied","Data":"44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda"} Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.060778 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerStarted","Data":"78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2"} Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.078808 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pf9vr" podStartSLOduration=3.067109036 podStartE2EDuration="49.078788503s" podCreationTimestamp="2026-03-18 12:12:13 +0000 UTC" firstStartedPulling="2026-03-18 12:12:15.495725333 +0000 UTC m=+155.045645972" lastFinishedPulling="2026-03-18 12:13:01.5074048 +0000 UTC m=+201.057325439" observedRunningTime="2026-03-18 12:13:02.078572297 +0000 UTC m=+201.628492936" watchObservedRunningTime="2026-03-18 12:13:02.078788503 +0000 UTC m=+201.628709142" Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.114711 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8gvwb" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="registry-server" probeResult="failure" output=< Mar 18 12:13:02 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 12:13:02 crc kubenswrapper[4921]: > Mar 18 12:13:02 crc kubenswrapper[4921]: I0318 12:13:02.596567 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-km5sn" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" 
containerName="registry-server" probeResult="failure" output=< Mar 18 12:13:02 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 12:13:02 crc kubenswrapper[4921]: > Mar 18 12:13:03 crc kubenswrapper[4921]: I0318 12:13:03.069572 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45chs" event={"ID":"21bdcd14-9430-4bf6-847e-6a31f0efd11a","Type":"ContainerStarted","Data":"c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d"} Mar 18 12:13:03 crc kubenswrapper[4921]: I0318 12:13:03.073864 4921 generic.go:334] "Generic (PLEG): container finished" podID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerID="78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2" exitCode=0 Mar 18 12:13:03 crc kubenswrapper[4921]: I0318 12:13:03.073931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerDied","Data":"78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2"} Mar 18 12:13:03 crc kubenswrapper[4921]: I0318 12:13:03.092811 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-45chs" podStartSLOduration=2.803048557 podStartE2EDuration="52.092792044s" podCreationTimestamp="2026-03-18 12:12:11 +0000 UTC" firstStartedPulling="2026-03-18 12:12:13.242094144 +0000 UTC m=+152.792014783" lastFinishedPulling="2026-03-18 12:13:02.531837631 +0000 UTC m=+202.081758270" observedRunningTime="2026-03-18 12:13:03.089437245 +0000 UTC m=+202.639357884" watchObservedRunningTime="2026-03-18 12:13:03.092792044 +0000 UTC m=+202.642712683" Mar 18 12:13:04 crc kubenswrapper[4921]: I0318 12:13:04.255978 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:13:04 crc kubenswrapper[4921]: I0318 12:13:04.256385 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:13:05 crc kubenswrapper[4921]: I0318 12:13:05.365651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerStarted","Data":"ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2"} Mar 18 12:13:05 crc kubenswrapper[4921]: I0318 12:13:05.385440 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dshr9" podStartSLOduration=2.511165422 podStartE2EDuration="53.385425156s" podCreationTimestamp="2026-03-18 12:12:12 +0000 UTC" firstStartedPulling="2026-03-18 12:12:14.321281681 +0000 UTC m=+153.871202320" lastFinishedPulling="2026-03-18 12:13:05.195541415 +0000 UTC m=+204.745462054" observedRunningTime="2026-03-18 12:13:05.38319917 +0000 UTC m=+204.933119839" watchObservedRunningTime="2026-03-18 12:13:05.385425156 +0000 UTC m=+204.935345795" Mar 18 12:13:05 crc kubenswrapper[4921]: I0318 12:13:05.393666 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pf9vr" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="registry-server" probeResult="failure" output=< Mar 18 12:13:05 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 12:13:05 crc kubenswrapper[4921]: > Mar 18 12:13:06 crc kubenswrapper[4921]: I0318 12:13:06.371511 4921 generic.go:334] "Generic (PLEG): container finished" podID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerID="7ba9fa3b85c54e2a83ca9a470710f3d9c85f5cbe41ea3b4e5764b7abdaa9ec69" exitCode=0 Mar 18 12:13:06 crc kubenswrapper[4921]: I0318 12:13:06.371555 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hx4pr" 
event={"ID":"9350ca3c-fa40-4169-87f2-06ac9d6c16bf","Type":"ContainerDied","Data":"7ba9fa3b85c54e2a83ca9a470710f3d9c85f5cbe41ea3b4e5764b7abdaa9ec69"} Mar 18 12:13:07 crc kubenswrapper[4921]: I0318 12:13:07.313220 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55484698f-q62xv"] Mar 18 12:13:07 crc kubenswrapper[4921]: I0318 12:13:07.313804 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" podUID="8345fe02-45ff-4afa-8736-fb518b9ec608" containerName="route-controller-manager" containerID="cri-o://ecebae7c005972aa10c7ddade2c5200176f93a89ef926d4ee9b04b4002bd76c1" gracePeriod=30 Mar 18 12:13:07 crc kubenswrapper[4921]: I0318 12:13:07.325948 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc66f4c79-zscm8"] Mar 18 12:13:07 crc kubenswrapper[4921]: I0318 12:13:07.326243 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" podUID="3bb48abc-efe9-4882-9e8f-89753ddb0bb3" containerName="controller-manager" containerID="cri-o://846aba68780d2f0b5bb0c3a33168928580728b92868e3b62c33a31dc6b4f9577" gracePeriod=30 Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.383002 4921 generic.go:334] "Generic (PLEG): container finished" podID="3bb48abc-efe9-4882-9e8f-89753ddb0bb3" containerID="846aba68780d2f0b5bb0c3a33168928580728b92868e3b62c33a31dc6b4f9577" exitCode=0 Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.383125 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" event={"ID":"3bb48abc-efe9-4882-9e8f-89753ddb0bb3","Type":"ContainerDied","Data":"846aba68780d2f0b5bb0c3a33168928580728b92868e3b62c33a31dc6b4f9577"} Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.385964 
4921 generic.go:334] "Generic (PLEG): container finished" podID="8345fe02-45ff-4afa-8736-fb518b9ec608" containerID="ecebae7c005972aa10c7ddade2c5200176f93a89ef926d4ee9b04b4002bd76c1" exitCode=0 Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.386016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" event={"ID":"8345fe02-45ff-4afa-8736-fb518b9ec608","Type":"ContainerDied","Data":"ecebae7c005972aa10c7ddade2c5200176f93a89ef926d4ee9b04b4002bd76c1"} Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.387781 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hx4pr" event={"ID":"9350ca3c-fa40-4169-87f2-06ac9d6c16bf","Type":"ContainerStarted","Data":"232cf41cf84c12e761f1428b79b970220f597e145f15174e4e71def24d3cecc6"} Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.412227 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hx4pr" podStartSLOduration=3.174417803 podStartE2EDuration="54.412211507s" podCreationTimestamp="2026-03-18 12:12:14 +0000 UTC" firstStartedPulling="2026-03-18 12:12:16.55927027 +0000 UTC m=+156.109190909" lastFinishedPulling="2026-03-18 12:13:07.797063974 +0000 UTC m=+207.346984613" observedRunningTime="2026-03-18 12:13:08.409468366 +0000 UTC m=+207.959389005" watchObservedRunningTime="2026-03-18 12:13:08.412211507 +0000 UTC m=+207.962132146" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.450542 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481331 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"] Mar 18 12:13:08 crc kubenswrapper[4921]: E0318 12:13:08.481579 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="extract-utilities" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481598 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="extract-utilities" Mar 18 12:13:08 crc kubenswrapper[4921]: E0318 12:13:08.481614 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="registry-server" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481621 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="registry-server" Mar 18 12:13:08 crc kubenswrapper[4921]: E0318 12:13:08.481637 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="extract-content" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481644 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="extract-content" Mar 18 12:13:08 crc kubenswrapper[4921]: E0318 12:13:08.481654 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8345fe02-45ff-4afa-8736-fb518b9ec608" containerName="route-controller-manager" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481663 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8345fe02-45ff-4afa-8736-fb518b9ec608" containerName="route-controller-manager" Mar 18 12:13:08 crc kubenswrapper[4921]: E0318 12:13:08.481682 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d7472e-2ed9-434f-b1f0-6147f9452a11" containerName="oc" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481692 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d7472e-2ed9-434f-b1f0-6147f9452a11" containerName="oc" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481809 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d7472e-2ed9-434f-b1f0-6147f9452a11" containerName="oc" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481825 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8345fe02-45ff-4afa-8736-fb518b9ec608" containerName="route-controller-manager" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.481839 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf28d70d-8b0a-4ae1-9747-99e9d42767a6" containerName="registry-server" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.482374 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.494728 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.534759 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"] Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.615872 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8345fe02-45ff-4afa-8736-fb518b9ec608-serving-cert\") pod \"8345fe02-45ff-4afa-8736-fb518b9ec608\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.615932 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-serving-cert\") pod \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.615981 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbxbh\" (UniqueName: \"kubernetes.io/projected/8345fe02-45ff-4afa-8736-fb518b9ec608-kube-api-access-cbxbh\") pod \"8345fe02-45ff-4afa-8736-fb518b9ec608\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-config\") pod \"8345fe02-45ff-4afa-8736-fb518b9ec608\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616080 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-proxy-ca-bundles\") pod 
\"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616097 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-config\") pod \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616139 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-client-ca\") pod \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616180 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-client-ca\") pod \"8345fe02-45ff-4afa-8736-fb518b9ec608\" (UID: \"8345fe02-45ff-4afa-8736-fb518b9ec608\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616205 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86tvv\" (UniqueName: \"kubernetes.io/projected/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-kube-api-access-86tvv\") pod \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\" (UID: \"3bb48abc-efe9-4882-9e8f-89753ddb0bb3\") " Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616330 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk2jz\" (UniqueName: \"kubernetes.io/projected/b170df13-7ef3-44a2-9c30-e8a3230173a6-kube-api-access-sk2jz\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: 
I0318 12:13:08.616391 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b170df13-7ef3-44a2-9c30-e8a3230173a6-serving-cert\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616423 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-client-ca\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616448 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-config\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616898 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-client-ca" (OuterVolumeSpecName: "client-ca") pod "3bb48abc-efe9-4882-9e8f-89753ddb0bb3" (UID: "3bb48abc-efe9-4882-9e8f-89753ddb0bb3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.616915 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3bb48abc-efe9-4882-9e8f-89753ddb0bb3" (UID: "3bb48abc-efe9-4882-9e8f-89753ddb0bb3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.617066 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-config" (OuterVolumeSpecName: "config") pod "3bb48abc-efe9-4882-9e8f-89753ddb0bb3" (UID: "3bb48abc-efe9-4882-9e8f-89753ddb0bb3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.617353 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-client-ca" (OuterVolumeSpecName: "client-ca") pod "8345fe02-45ff-4afa-8736-fb518b9ec608" (UID: "8345fe02-45ff-4afa-8736-fb518b9ec608"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.617418 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-config" (OuterVolumeSpecName: "config") pod "8345fe02-45ff-4afa-8736-fb518b9ec608" (UID: "8345fe02-45ff-4afa-8736-fb518b9ec608"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.628319 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3bb48abc-efe9-4882-9e8f-89753ddb0bb3" (UID: "3bb48abc-efe9-4882-9e8f-89753ddb0bb3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.628384 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8345fe02-45ff-4afa-8736-fb518b9ec608-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8345fe02-45ff-4afa-8736-fb518b9ec608" (UID: "8345fe02-45ff-4afa-8736-fb518b9ec608"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.628401 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-kube-api-access-86tvv" (OuterVolumeSpecName: "kube-api-access-86tvv") pod "3bb48abc-efe9-4882-9e8f-89753ddb0bb3" (UID: "3bb48abc-efe9-4882-9e8f-89753ddb0bb3"). InnerVolumeSpecName "kube-api-access-86tvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.628450 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8345fe02-45ff-4afa-8736-fb518b9ec608-kube-api-access-cbxbh" (OuterVolumeSpecName: "kube-api-access-cbxbh") pod "8345fe02-45ff-4afa-8736-fb518b9ec608" (UID: "8345fe02-45ff-4afa-8736-fb518b9ec608"). InnerVolumeSpecName "kube-api-access-cbxbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717400 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk2jz\" (UniqueName: \"kubernetes.io/projected/b170df13-7ef3-44a2-9c30-e8a3230173a6-kube-api-access-sk2jz\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717802 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b170df13-7ef3-44a2-9c30-e8a3230173a6-serving-cert\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717847 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-client-ca\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-config\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717927 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717944 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717956 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86tvv\" (UniqueName: \"kubernetes.io/projected/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-kube-api-access-86tvv\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717967 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8345fe02-45ff-4afa-8736-fb518b9ec608-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717978 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717988 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbxbh\" (UniqueName: \"kubernetes.io/projected/8345fe02-45ff-4afa-8736-fb518b9ec608-kube-api-access-cbxbh\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.717998 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8345fe02-45ff-4afa-8736-fb518b9ec608-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.718007 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc 
kubenswrapper[4921]: I0318 12:13:08.718017 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bb48abc-efe9-4882-9e8f-89753ddb0bb3-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.718904 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-client-ca\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.719142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-config\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.722414 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b170df13-7ef3-44a2-9c30-e8a3230173a6-serving-cert\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.732956 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk2jz\" (UniqueName: \"kubernetes.io/projected/b170df13-7ef3-44a2-9c30-e8a3230173a6-kube-api-access-sk2jz\") pod \"route-controller-manager-66f69946f6-gqlqp\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") " pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 
12:13:08.808907 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:08 crc kubenswrapper[4921]: I0318 12:13:08.997602 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"] Mar 18 12:13:09 crc kubenswrapper[4921]: W0318 12:13:09.003993 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb170df13_7ef3_44a2_9c30_e8a3230173a6.slice/crio-2f6b43d36d5177c3d51960d11409592c209ab6017446f248aa43014585b50636 WatchSource:0}: Error finding container 2f6b43d36d5177c3d51960d11409592c209ab6017446f248aa43014585b50636: Status 404 returned error can't find the container with id 2f6b43d36d5177c3d51960d11409592c209ab6017446f248aa43014585b50636 Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.402488 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" event={"ID":"b170df13-7ef3-44a2-9c30-e8a3230173a6","Type":"ContainerStarted","Data":"418dfc1a57465d44e7a641a9686fd979c39bf62fc73a7982cc237b3be52967f2"} Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.402842 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" event={"ID":"b170df13-7ef3-44a2-9c30-e8a3230173a6","Type":"ContainerStarted","Data":"2f6b43d36d5177c3d51960d11409592c209ab6017446f248aa43014585b50636"} Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.402865 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.404733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" event={"ID":"3bb48abc-efe9-4882-9e8f-89753ddb0bb3","Type":"ContainerDied","Data":"c999044af2e2f18688b4eecb0e586bb433aee7e77594a9618f06103a9be5758c"} Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.404765 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc66f4c79-zscm8" Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.404773 4921 scope.go:117] "RemoveContainer" containerID="846aba68780d2f0b5bb0c3a33168928580728b92868e3b62c33a31dc6b4f9577" Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.407336 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.407311 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55484698f-q62xv" event={"ID":"8345fe02-45ff-4afa-8736-fb518b9ec608","Type":"ContainerDied","Data":"92c8b457f428dbad19108e03454ac9c767bfe331f3d5c7cce131a53bf89e8e09"} Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.424984 4921 scope.go:117] "RemoveContainer" containerID="ecebae7c005972aa10c7ddade2c5200176f93a89ef926d4ee9b04b4002bd76c1" Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.426904 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" podStartSLOduration=2.426892709 podStartE2EDuration="2.426892709s" podCreationTimestamp="2026-03-18 12:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:09.426228139 +0000 UTC m=+208.976148778" watchObservedRunningTime="2026-03-18 12:13:09.426892709 +0000 UTC m=+208.976813348" Mar 18 12:13:09 crc 
kubenswrapper[4921]: I0318 12:13:09.440923 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc66f4c79-zscm8"] Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.444866 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bc66f4c79-zscm8"] Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.459245 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55484698f-q62xv"] Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.461710 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55484698f-q62xv"] Mar 18 12:13:09 crc kubenswrapper[4921]: I0318 12:13:09.657297 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.751063 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6696c6db88-7l6ph"] Mar 18 12:13:10 crc kubenswrapper[4921]: E0318 12:13:10.751309 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb48abc-efe9-4882-9e8f-89753ddb0bb3" containerName="controller-manager" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.751323 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb48abc-efe9-4882-9e8f-89753ddb0bb3" containerName="controller-manager" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.751423 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb48abc-efe9-4882-9e8f-89753ddb0bb3" containerName="controller-manager" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.751820 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.755033 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.755578 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.755864 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.756167 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.756418 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.757594 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.764592 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.773254 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6696c6db88-7l6ph"] Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.844367 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-serving-cert\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " 
pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.844486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7htr\" (UniqueName: \"kubernetes.io/projected/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-kube-api-access-n7htr\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.844604 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-config\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.844667 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-client-ca\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.844748 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-proxy-ca-bundles\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.945665 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-config\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.945751 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-client-ca\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.945809 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-proxy-ca-bundles\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.946266 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-serving-cert\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.946358 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7htr\" (UniqueName: \"kubernetes.io/projected/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-kube-api-access-n7htr\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.947482 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-proxy-ca-bundles\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.947865 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-config\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.948227 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-client-ca\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.956701 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-serving-cert\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:10 crc kubenswrapper[4921]: I0318 12:13:10.966452 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7htr\" (UniqueName: \"kubernetes.io/projected/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-kube-api-access-n7htr\") pod \"controller-manager-6696c6db88-7l6ph\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.068268 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.183156 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8gvwb"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.241543 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb48abc-efe9-4882-9e8f-89753ddb0bb3" path="/var/lib/kubelet/pods/3bb48abc-efe9-4882-9e8f-89753ddb0bb3/volumes"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.242999 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8345fe02-45ff-4afa-8736-fb518b9ec608" path="/var/lib/kubelet/pods/8345fe02-45ff-4afa-8736-fb518b9ec608/volumes"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.243620 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8gvwb"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.300020 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz82p"]
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.546632 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6696c6db88-7l6ph"]
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.600185 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-km5sn"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.645811 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-km5sn"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.687388 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-45chs"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.687455 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-45chs"
Mar 18 12:13:11 crc kubenswrapper[4921]: I0318 12:13:11.722085 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-45chs"
Mar 18 12:13:12 crc kubenswrapper[4921]: I0318 12:13:12.426918 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" event={"ID":"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d","Type":"ContainerStarted","Data":"e543ea08145709098e59faecd4a891b77a63cded72be7b7d69267aa3492fbd2c"}
Mar 18 12:13:12 crc kubenswrapper[4921]: I0318 12:13:12.427274 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" event={"ID":"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d","Type":"ContainerStarted","Data":"48d485b41d1adcb5d89abb0f841a5d1af3089ffbc64c2397727a3576d692f2ce"}
Mar 18 12:13:12 crc kubenswrapper[4921]: I0318 12:13:12.511390 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-45chs"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.104511 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-km5sn"]
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.127817 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dshr9"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.127881 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dshr9"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.166864 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dshr9"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.432311 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-km5sn" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="registry-server" containerID="cri-o://6aad3434490562b863b21fa2dafd86ca52f72e6c83c829ab4e9e0ff49ff875a0" gracePeriod=2
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.432810 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.438128 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.458279 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" podStartSLOduration=6.45826053 podStartE2EDuration="6.45826053s" podCreationTimestamp="2026-03-18 12:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:13.457632671 +0000 UTC m=+213.007553310" watchObservedRunningTime="2026-03-18 12:13:13.45826053 +0000 UTC m=+213.008181169"
Mar 18 12:13:13 crc kubenswrapper[4921]: I0318 12:13:13.483616 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dshr9"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.106858 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45chs"]
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.317682 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pf9vr"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.376859 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pf9vr"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.445365 4921 generic.go:334] "Generic (PLEG): container finished" podID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerID="6aad3434490562b863b21fa2dafd86ca52f72e6c83c829ab4e9e0ff49ff875a0" exitCode=0
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.445413 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-km5sn" event={"ID":"6c23a325-dab9-40a8-bd8b-1f571140cdca","Type":"ContainerDied","Data":"6aad3434490562b863b21fa2dafd86ca52f72e6c83c829ab4e9e0ff49ff875a0"}
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.445812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-km5sn" event={"ID":"6c23a325-dab9-40a8-bd8b-1f571140cdca","Type":"ContainerDied","Data":"d0cb0d358d695c8356775df1a699e0b4e499e900236af3818ea8facc5bd4b728"}
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.445835 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0cb0d358d695c8356775df1a699e0b4e499e900236af3818ea8facc5bd4b728"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.446647 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-45chs" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="registry-server" containerID="cri-o://c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d" gracePeriod=2
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.466104 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-km5sn"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.592245 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-utilities\") pod \"6c23a325-dab9-40a8-bd8b-1f571140cdca\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") "
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.592380 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws22g\" (UniqueName: \"kubernetes.io/projected/6c23a325-dab9-40a8-bd8b-1f571140cdca-kube-api-access-ws22g\") pod \"6c23a325-dab9-40a8-bd8b-1f571140cdca\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") "
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.592455 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-catalog-content\") pod \"6c23a325-dab9-40a8-bd8b-1f571140cdca\" (UID: \"6c23a325-dab9-40a8-bd8b-1f571140cdca\") "
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.593725 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-utilities" (OuterVolumeSpecName: "utilities") pod "6c23a325-dab9-40a8-bd8b-1f571140cdca" (UID: "6c23a325-dab9-40a8-bd8b-1f571140cdca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.597651 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c23a325-dab9-40a8-bd8b-1f571140cdca-kube-api-access-ws22g" (OuterVolumeSpecName: "kube-api-access-ws22g") pod "6c23a325-dab9-40a8-bd8b-1f571140cdca" (UID: "6c23a325-dab9-40a8-bd8b-1f571140cdca"). InnerVolumeSpecName "kube-api-access-ws22g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.663786 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c23a325-dab9-40a8-bd8b-1f571140cdca" (UID: "6c23a325-dab9-40a8-bd8b-1f571140cdca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.694528 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.694572 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws22g\" (UniqueName: \"kubernetes.io/projected/6c23a325-dab9-40a8-bd8b-1f571140cdca-kube-api-access-ws22g\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.694622 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23a325-dab9-40a8-bd8b-1f571140cdca-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.751205 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hx4pr"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.751345 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hx4pr"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.791002 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hx4pr"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.868614 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45chs"
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.997679 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-utilities\") pod \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") "
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.997780 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-catalog-content\") pod \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") "
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.997813 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lj7s\" (UniqueName: \"kubernetes.io/projected/21bdcd14-9430-4bf6-847e-6a31f0efd11a-kube-api-access-4lj7s\") pod \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\" (UID: \"21bdcd14-9430-4bf6-847e-6a31f0efd11a\") "
Mar 18 12:13:14 crc kubenswrapper[4921]: I0318 12:13:14.998565 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-utilities" (OuterVolumeSpecName: "utilities") pod "21bdcd14-9430-4bf6-847e-6a31f0efd11a" (UID: "21bdcd14-9430-4bf6-847e-6a31f0efd11a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.001243 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bdcd14-9430-4bf6-847e-6a31f0efd11a-kube-api-access-4lj7s" (OuterVolumeSpecName: "kube-api-access-4lj7s") pod "21bdcd14-9430-4bf6-847e-6a31f0efd11a" (UID: "21bdcd14-9430-4bf6-847e-6a31f0efd11a"). InnerVolumeSpecName "kube-api-access-4lj7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.049928 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21bdcd14-9430-4bf6-847e-6a31f0efd11a" (UID: "21bdcd14-9430-4bf6-847e-6a31f0efd11a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.098873 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.098915 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lj7s\" (UniqueName: \"kubernetes.io/projected/21bdcd14-9430-4bf6-847e-6a31f0efd11a-kube-api-access-4lj7s\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.098979 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21bdcd14-9430-4bf6-847e-6a31f0efd11a-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.463129 4921 generic.go:334] "Generic (PLEG): container finished" podID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerID="c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d" exitCode=0
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.463475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45chs" event={"ID":"21bdcd14-9430-4bf6-847e-6a31f0efd11a","Type":"ContainerDied","Data":"c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d"}
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.463569 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45chs" event={"ID":"21bdcd14-9430-4bf6-847e-6a31f0efd11a","Type":"ContainerDied","Data":"9b1b74bf97966a75a0db23e0875c8aa088669ca018ff8206d60e3bbb8d8435f4"}
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.463653 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45chs"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.463708 4921 scope.go:117] "RemoveContainer" containerID="c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.463744 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-km5sn"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.489395 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-km5sn"]
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.489638 4921 scope.go:117] "RemoveContainer" containerID="44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.496420 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-km5sn"]
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.505475 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45chs"]
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.508346 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-45chs"]
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.510528 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hx4pr"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.516539 4921 scope.go:117] "RemoveContainer" containerID="a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.540052 4921 scope.go:117] "RemoveContainer" containerID="c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d"
Mar 18 12:13:15 crc kubenswrapper[4921]: E0318 12:13:15.540854 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d\": container with ID starting with c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d not found: ID does not exist" containerID="c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.540906 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d"} err="failed to get container status \"c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d\": rpc error: code = NotFound desc = could not find container \"c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d\": container with ID starting with c22c726f789755aa0abd87d4f11b16adb573a3841f4c758596880ebae213e98d not found: ID does not exist"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.540939 4921 scope.go:117] "RemoveContainer" containerID="44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda"
Mar 18 12:13:15 crc kubenswrapper[4921]: E0318 12:13:15.541500 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda\": container with ID starting with 44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda not found: ID does not exist" containerID="44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.541582 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda"} err="failed to get container status \"44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda\": rpc error: code = NotFound desc = could not find container \"44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda\": container with ID starting with 44e745729d1f926aebed6dbcacf6dc668c6e26f78a04449a2504bd8ca4ca7fda not found: ID does not exist"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.541633 4921 scope.go:117] "RemoveContainer" containerID="a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe"
Mar 18 12:13:15 crc kubenswrapper[4921]: E0318 12:13:15.542196 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe\": container with ID starting with a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe not found: ID does not exist" containerID="a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe"
Mar 18 12:13:15 crc kubenswrapper[4921]: I0318 12:13:15.542230 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe"} err="failed to get container status \"a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe\": rpc error: code = NotFound desc = could not find container \"a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe\": container with ID starting with a5978d2d0bbe001eaa8462238a2509409a2d966ba785cb90527cd90e9e5aa0fe not found: ID does not exist"
Mar 18 12:13:17 crc kubenswrapper[4921]: I0318 12:13:17.218936 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" path="/var/lib/kubelet/pods/21bdcd14-9430-4bf6-847e-6a31f0efd11a/volumes"
Mar 18 12:13:17 crc kubenswrapper[4921]: I0318 12:13:17.221022 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" path="/var/lib/kubelet/pods/6c23a325-dab9-40a8-bd8b-1f571140cdca/volumes"
Mar 18 12:13:18 crc kubenswrapper[4921]: I0318 12:13:18.508176 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hx4pr"]
Mar 18 12:13:18 crc kubenswrapper[4921]: I0318 12:13:18.508475 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hx4pr" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="registry-server" containerID="cri-o://232cf41cf84c12e761f1428b79b970220f597e145f15174e4e71def24d3cecc6" gracePeriod=2
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.488090 4921 generic.go:334] "Generic (PLEG): container finished" podID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerID="232cf41cf84c12e761f1428b79b970220f597e145f15174e4e71def24d3cecc6" exitCode=0
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.488160 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hx4pr" event={"ID":"9350ca3c-fa40-4169-87f2-06ac9d6c16bf","Type":"ContainerDied","Data":"232cf41cf84c12e761f1428b79b970220f597e145f15174e4e71def24d3cecc6"}
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.544344 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hx4pr"
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.659128 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-catalog-content\") pod \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") "
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.659247 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m6kt\" (UniqueName: \"kubernetes.io/projected/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-kube-api-access-2m6kt\") pod \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") "
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.659290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-utilities\") pod \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\" (UID: \"9350ca3c-fa40-4169-87f2-06ac9d6c16bf\") "
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.660047 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-utilities" (OuterVolumeSpecName: "utilities") pod "9350ca3c-fa40-4169-87f2-06ac9d6c16bf" (UID: "9350ca3c-fa40-4169-87f2-06ac9d6c16bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.672349 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-kube-api-access-2m6kt" (OuterVolumeSpecName: "kube-api-access-2m6kt") pod "9350ca3c-fa40-4169-87f2-06ac9d6c16bf" (UID: "9350ca3c-fa40-4169-87f2-06ac9d6c16bf"). InnerVolumeSpecName "kube-api-access-2m6kt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.760735 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.760821 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m6kt\" (UniqueName: \"kubernetes.io/projected/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-kube-api-access-2m6kt\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.796357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9350ca3c-fa40-4169-87f2-06ac9d6c16bf" (UID: "9350ca3c-fa40-4169-87f2-06ac9d6c16bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:13:19 crc kubenswrapper[4921]: I0318 12:13:19.861813 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9350ca3c-fa40-4169-87f2-06ac9d6c16bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.501908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hx4pr" event={"ID":"9350ca3c-fa40-4169-87f2-06ac9d6c16bf","Type":"ContainerDied","Data":"ae2f10a1ecd7c55ba6b585b74ce63726a652ff3799e3185f6eef613d94a604c9"}
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.501976 4921 scope.go:117] "RemoveContainer" containerID="232cf41cf84c12e761f1428b79b970220f597e145f15174e4e71def24d3cecc6"
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.502001 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hx4pr"
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.526081 4921 scope.go:117] "RemoveContainer" containerID="7ba9fa3b85c54e2a83ca9a470710f3d9c85f5cbe41ea3b4e5764b7abdaa9ec69"
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.537398 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hx4pr"]
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.546012 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hx4pr"]
Mar 18 12:13:20 crc kubenswrapper[4921]: I0318 12:13:20.553553 4921 scope.go:117] "RemoveContainer" containerID="9abd892cccae12932d757e6c9f61b16311ada66f1adb273fcb23ed70fcb56f62"
Mar 18 12:13:21 crc kubenswrapper[4921]: I0318 12:13:21.215533 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" path="/var/lib/kubelet/pods/9350ca3c-fa40-4169-87f2-06ac9d6c16bf/volumes"
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.279057 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6696c6db88-7l6ph"]
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.279958 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" podUID="aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" containerName="controller-manager" containerID="cri-o://e543ea08145709098e59faecd4a891b77a63cded72be7b7d69267aa3492fbd2c" gracePeriod=30
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.374852 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"]
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.375104 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" podUID="b170df13-7ef3-44a2-9c30-e8a3230173a6" containerName="route-controller-manager" containerID="cri-o://418dfc1a57465d44e7a641a9686fd979c39bf62fc73a7982cc237b3be52967f2" gracePeriod=30
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.544131 4921 generic.go:334] "Generic (PLEG): container finished" podID="b170df13-7ef3-44a2-9c30-e8a3230173a6" containerID="418dfc1a57465d44e7a641a9686fd979c39bf62fc73a7982cc237b3be52967f2" exitCode=0
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.544200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" event={"ID":"b170df13-7ef3-44a2-9c30-e8a3230173a6","Type":"ContainerDied","Data":"418dfc1a57465d44e7a641a9686fd979c39bf62fc73a7982cc237b3be52967f2"}
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.545949 4921 generic.go:334] "Generic (PLEG): container finished" podID="aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" containerID="e543ea08145709098e59faecd4a891b77a63cded72be7b7d69267aa3492fbd2c" exitCode=0
Mar 18 12:13:27 crc kubenswrapper[4921]: I0318 12:13:27.545978 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" event={"ID":"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d","Type":"ContainerDied","Data":"e543ea08145709098e59faecd4a891b77a63cded72be7b7d69267aa3492fbd2c"}
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.409533 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466547 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b"]
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466756 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="extract-content"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466768 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="extract-content"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466777 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="extract-utilities"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466783 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="extract-utilities"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466793 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="extract-utilities"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466799 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="extract-utilities"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466808 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b170df13-7ef3-44a2-9c30-e8a3230173a6" containerName="route-controller-manager"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466814 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b170df13-7ef3-44a2-9c30-e8a3230173a6" containerName="route-controller-manager"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466824 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466830 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466839 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="extract-content"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466845 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="extract-content"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466855 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="extract-utilities"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466861 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="extract-utilities"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466872 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="extract-content"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466877 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="extract-content"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466887 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466892 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: E0318 12:13:28.466900 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466905 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.466992 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bdcd14-9430-4bf6-847e-6a31f0efd11a" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.467004 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c23a325-dab9-40a8-bd8b-1f571140cdca" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.467013 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9350ca3c-fa40-4169-87f2-06ac9d6c16bf" containerName="registry-server"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.467020 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b170df13-7ef3-44a2-9c30-e8a3230173a6" containerName="route-controller-manager"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.467378 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.468820 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b"]
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.476787 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph"
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.481562 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk2jz\" (UniqueName: \"kubernetes.io/projected/b170df13-7ef3-44a2-9c30-e8a3230173a6-kube-api-access-sk2jz\") pod \"b170df13-7ef3-44a2-9c30-e8a3230173a6\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") "
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.481639 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-config\") pod \"b170df13-7ef3-44a2-9c30-e8a3230173a6\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") "
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.481675 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b170df13-7ef3-44a2-9c30-e8a3230173a6-serving-cert\") pod \"b170df13-7ef3-44a2-9c30-e8a3230173a6\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") "
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.481693 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-client-ca\") pod \"b170df13-7ef3-44a2-9c30-e8a3230173a6\" (UID: \"b170df13-7ef3-44a2-9c30-e8a3230173a6\") "
Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.482791 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "b170df13-7ef3-44a2-9c30-e8a3230173a6" (UID: "b170df13-7ef3-44a2-9c30-e8a3230173a6"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.482897 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-config" (OuterVolumeSpecName: "config") pod "b170df13-7ef3-44a2-9c30-e8a3230173a6" (UID: "b170df13-7ef3-44a2-9c30-e8a3230173a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.488355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b170df13-7ef3-44a2-9c30-e8a3230173a6-kube-api-access-sk2jz" (OuterVolumeSpecName: "kube-api-access-sk2jz") pod "b170df13-7ef3-44a2-9c30-e8a3230173a6" (UID: "b170df13-7ef3-44a2-9c30-e8a3230173a6"). InnerVolumeSpecName "kube-api-access-sk2jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.496589 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b170df13-7ef3-44a2-9c30-e8a3230173a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b170df13-7ef3-44a2-9c30-e8a3230173a6" (UID: "b170df13-7ef3-44a2-9c30-e8a3230173a6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.551718 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" event={"ID":"b170df13-7ef3-44a2-9c30-e8a3230173a6","Type":"ContainerDied","Data":"2f6b43d36d5177c3d51960d11409592c209ab6017446f248aa43014585b50636"} Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.551777 4921 scope.go:117] "RemoveContainer" containerID="418dfc1a57465d44e7a641a9686fd979c39bf62fc73a7982cc237b3be52967f2" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.551885 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.557493 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" event={"ID":"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d","Type":"ContainerDied","Data":"48d485b41d1adcb5d89abb0f841a5d1af3089ffbc64c2397727a3576d692f2ce"} Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.557614 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6696c6db88-7l6ph" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.582883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-config\") pod \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583089 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-client-ca\") pod \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583261 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7htr\" (UniqueName: \"kubernetes.io/projected/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-kube-api-access-n7htr\") pod \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583364 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-proxy-ca-bundles\") pod \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583466 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-serving-cert\") pod \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\" (UID: \"aeca5747-be46-4393-bfa5-9ca7f0f5ad9d\") " Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583724 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-client-ca\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583833 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-config\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583944 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcn8m\" (UniqueName: \"kubernetes.io/projected/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-kube-api-access-hcn8m\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.584172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-serving-cert\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.584378 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk2jz\" (UniqueName: \"kubernetes.io/projected/b170df13-7ef3-44a2-9c30-e8a3230173a6-kube-api-access-sk2jz\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: 
I0318 12:13:28.584476 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.584556 4921 scope.go:117] "RemoveContainer" containerID="e543ea08145709098e59faecd4a891b77a63cded72be7b7d69267aa3492fbd2c" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.583721 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-config" (OuterVolumeSpecName: "config") pod "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" (UID: "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.585142 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b170df13-7ef3-44a2-9c30-e8a3230173a6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.585327 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b170df13-7ef3-44a2-9c30-e8a3230173a6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.585025 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-client-ca" (OuterVolumeSpecName: "client-ca") pod "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" (UID: "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.585097 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" (UID: "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.590269 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" (UID: "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.596655 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-kube-api-access-n7htr" (OuterVolumeSpecName: "kube-api-access-n7htr") pod "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" (UID: "aeca5747-be46-4393-bfa5-9ca7f0f5ad9d"). InnerVolumeSpecName "kube-api-access-n7htr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.598227 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"] Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.601677 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f69946f6-gqlqp"] Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686670 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcn8m\" (UniqueName: \"kubernetes.io/projected/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-kube-api-access-hcn8m\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686768 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-serving-cert\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686836 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-client-ca\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686862 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-config\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686908 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686922 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686938 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686948 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.686958 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7htr\" (UniqueName: \"kubernetes.io/projected/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d-kube-api-access-n7htr\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.688087 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-client-ca\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.688221 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-config\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.697455 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-serving-cert\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.709259 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcn8m\" (UniqueName: \"kubernetes.io/projected/82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5-kube-api-access-hcn8m\") pod \"route-controller-manager-7c8fdb45bd-zhx4b\" (UID: \"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5\") " pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.809406 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.895237 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6696c6db88-7l6ph"] Mar 18 12:13:28 crc kubenswrapper[4921]: I0318 12:13:28.898578 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6696c6db88-7l6ph"] Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.227801 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" path="/var/lib/kubelet/pods/aeca5747-be46-4393-bfa5-9ca7f0f5ad9d/volumes" Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.229560 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b170df13-7ef3-44a2-9c30-e8a3230173a6" path="/var/lib/kubelet/pods/b170df13-7ef3-44a2-9c30-e8a3230173a6/volumes" Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.230529 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b"] Mar 18 12:13:29 crc kubenswrapper[4921]: W0318 12:13:29.239700 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f4dc40_edd9_4add_9e4a_ac9cf4f9e7c5.slice/crio-b68d0778b045aa525049c2e13710207c32d77d849651e3a03e9e3ba47356b2e1 WatchSource:0}: Error finding container b68d0778b045aa525049c2e13710207c32d77d849651e3a03e9e3ba47356b2e1: Status 404 returned error can't find the container with id b68d0778b045aa525049c2e13710207c32d77d849651e3a03e9e3ba47356b2e1 Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.568919 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" 
event={"ID":"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5","Type":"ContainerStarted","Data":"40d72dbecc3d80007c919164fcb16caad31eef47e92f827474c5d829d4bc922d"} Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.568971 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" event={"ID":"82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5","Type":"ContainerStarted","Data":"b68d0778b045aa525049c2e13710207c32d77d849651e3a03e9e3ba47356b2e1"} Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.569342 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.590219 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" podStartSLOduration=2.590195984 podStartE2EDuration="2.590195984s" podCreationTimestamp="2026-03-18 12:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:29.59003795 +0000 UTC m=+229.139958589" watchObservedRunningTime="2026-03-18 12:13:29.590195984 +0000 UTC m=+229.140116633" Mar 18 12:13:29 crc kubenswrapper[4921]: I0318 12:13:29.733990 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.271757 4921 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.272099 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" containerName="controller-manager" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.272163 
4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" containerName="controller-manager" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.272325 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeca5747-be46-4393-bfa5-9ca7f0f5ad9d" containerName="controller-manager" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.272817 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.272897 4921 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.273450 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8" gracePeriod=15 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.273543 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697" gracePeriod=15 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.273473 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe" gracePeriod=15 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.273540 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f" gracePeriod=15 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.273568 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a" gracePeriod=15 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274214 4921 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274420 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274460 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274472 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274480 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274493 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274503 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274513 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274522 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274533 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274541 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274550 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274559 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274580 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274600 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274617 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274628 4921 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.274643 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274653 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274798 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274815 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274827 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274895 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274913 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274926 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.274939 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.275140 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.275158 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.275332 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.275357 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.374776 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.375893 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.380478 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.380982 4921 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.381407 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.381445 4921 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.381811 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="200ms" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419234 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419277 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419298 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419327 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419378 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.419399 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520183 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520252 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520373 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520406 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520428 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520462 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520492 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520626 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520559 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520666 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520819 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.520833 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.580190 4921 generic.go:334] "Generic (PLEG): container finished" podID="c57598a3-9521-4697-969f-6c365d74c4e7" containerID="22e248748409d7ca9e1e7d6765ba452b1c8a334dc77c4107f17afa3938c41b03" exitCode=0 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.580253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c57598a3-9521-4697-969f-6c365d74c4e7","Type":"ContainerDied","Data":"22e248748409d7ca9e1e7d6765ba452b1c8a334dc77c4107f17afa3938c41b03"} Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.580938 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.581448 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.582222 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="400ms" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.583821 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.585171 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.585929 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a" exitCode=0 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.585962 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe" exitCode=0 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.585975 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f" exitCode=0 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.585985 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697" exitCode=2 Mar 18 12:13:30 crc kubenswrapper[4921]: I0318 12:13:30.585995 4921 scope.go:117] "RemoveContainer" containerID="3de8b1f9dc03a53b56831fba91cebfa22d4ccd291da7a31a3bc6e21535a03a15" Mar 18 12:13:30 crc kubenswrapper[4921]: E0318 12:13:30.983554 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="800ms" Mar 18 12:13:31 crc kubenswrapper[4921]: I0318 12:13:31.213789 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:31 crc kubenswrapper[4921]: I0318 12:13:31.214899 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:31 crc kubenswrapper[4921]: I0318 12:13:31.595828 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.643498 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.644261 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.645166 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.645553 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 
12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.645883 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.646010 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:13:31 crc kubenswrapper[4921]: E0318 12:13:31.784780 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="1.6s" Mar 18 12:13:31 crc kubenswrapper[4921]: I0318 12:13:31.874775 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:13:31 crc kubenswrapper[4921]: I0318 12:13:31.875593 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.044052 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c57598a3-9521-4697-969f-6c365d74c4e7-kube-api-access\") pod \"c57598a3-9521-4697-969f-6c365d74c4e7\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.044139 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-kubelet-dir\") pod 
\"c57598a3-9521-4697-969f-6c365d74c4e7\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.044165 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-var-lock\") pod \"c57598a3-9521-4697-969f-6c365d74c4e7\" (UID: \"c57598a3-9521-4697-969f-6c365d74c4e7\") " Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.044483 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-var-lock" (OuterVolumeSpecName: "var-lock") pod "c57598a3-9521-4697-969f-6c365d74c4e7" (UID: "c57598a3-9521-4697-969f-6c365d74c4e7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.044843 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c57598a3-9521-4697-969f-6c365d74c4e7" (UID: "c57598a3-9521-4697-969f-6c365d74c4e7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.055644 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57598a3-9521-4697-969f-6c365d74c4e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c57598a3-9521-4697-969f-6c365d74c4e7" (UID: "c57598a3-9521-4697-969f-6c365d74c4e7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.146316 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c57598a3-9521-4697-969f-6c365d74c4e7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.146377 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.146392 4921 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c57598a3-9521-4697-969f-6c365d74c4e7-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.605170 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c57598a3-9521-4697-969f-6c365d74c4e7","Type":"ContainerDied","Data":"ef8c177ef244a4c65dc84ccf02420b2f47630ca38ab961755f97ea323693e375"} Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.605430 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef8c177ef244a4c65dc84ccf02420b2f47630ca38ab961755f97ea323693e375" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.605245 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.608666 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.609528 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8" exitCode=0 Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.670934 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.674561 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.675645 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.676242 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.676457 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.752940 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753093 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753275 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753332 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753486 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753650 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753957 4921 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753981 4921 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:32 crc kubenswrapper[4921]: I0318 12:13:32.753994 4921 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.222230 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 12:13:33 crc kubenswrapper[4921]: E0318 12:13:33.386799 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="3.2s" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.618386 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.619913 4921 scope.go:117] "RemoveContainer" containerID="9a627ad78a9d58434c6c6cccb2f757b8fd4982c47e8c2c9ccf32ee165abea29a" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.620156 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.620921 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.621226 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.623440 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.623684 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.641068 4921 scope.go:117] "RemoveContainer" containerID="cf9af6a9a562a5c9e9ef070183dffd745fdd012210e629cfc2a9f28ce90721fe" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.655962 4921 scope.go:117] "RemoveContainer" containerID="8db619a84d7184bc56db3cf6b988c60c5cca030ea1bc371e77c1131c1548ea5f" Mar 18 12:13:33 crc 
kubenswrapper[4921]: I0318 12:13:33.672229 4921 scope.go:117] "RemoveContainer" containerID="2f77bcbf5e807f2ee5a7f1f32e194ae79575a7175fbd320959b367988b039697" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.688045 4921 scope.go:117] "RemoveContainer" containerID="389906f7c95e404bed933d46f2f88fc5d2e786e5fa923e283fef91bec6d73bc8" Mar 18 12:13:33 crc kubenswrapper[4921]: I0318 12:13:33.706811 4921 scope.go:117] "RemoveContainer" containerID="f817ff827c299964491a7b5e5be7a57d17082728aa7d25367285cf36a8c84451" Mar 18 12:13:35 crc kubenswrapper[4921]: E0318 12:13:35.348733 4921 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:35 crc kubenswrapper[4921]: I0318 12:13:35.349495 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:35 crc kubenswrapper[4921]: E0318 12:13:35.390284 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee75330f84eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:13:35.389082859 +0000 UTC m=+234.939003538,LastTimestamp:2026-03-18 12:13:35.389082859 +0000 UTC m=+234.939003538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:13:35 crc kubenswrapper[4921]: I0318 12:13:35.634638 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0c0f470f17e71088a4e47ce6b859d8f49be40b8c89c4934c9cef28aad8d08497"} Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.349807 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" containerName="oauth-openshift" containerID="cri-o://f4b91ce515379e77672f494b0a44cb0b6f3407484284b5ac7d9016f3126c151e" gracePeriod=15 Mar 18 12:13:36 crc kubenswrapper[4921]: E0318 12:13:36.587676 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="6.4s" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.646379 4921 generic.go:334] "Generic (PLEG): container finished" podID="7205a33d-ffe1-447c-b1db-756842fcfb4d" containerID="f4b91ce515379e77672f494b0a44cb0b6f3407484284b5ac7d9016f3126c151e" exitCode=0 Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.646486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" event={"ID":"7205a33d-ffe1-447c-b1db-756842fcfb4d","Type":"ContainerDied","Data":"f4b91ce515379e77672f494b0a44cb0b6f3407484284b5ac7d9016f3126c151e"} Mar 18 12:13:36 crc 
kubenswrapper[4921]: I0318 12:13:36.650175 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6"} Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.651236 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4921]: E0318 12:13:36.651262 4921 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.774914 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.775518 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.775942 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939661 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-session\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939709 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-trusted-ca-bundle\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939756 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-dir\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: 
\"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939776 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-router-certs\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939793 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsgv2\" (UniqueName: \"kubernetes.io/projected/7205a33d-ffe1-447c-b1db-756842fcfb4d-kube-api-access-qsgv2\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939818 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-service-ca\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939837 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-cliconfig\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939856 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-serving-cert\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 
12:13:36.939878 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-ocp-branding-template\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939894 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-error\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939933 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-idp-0-file-data\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939953 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-policies\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.939986 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-provider-selection\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.940024 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-login\") pod \"7205a33d-ffe1-447c-b1db-756842fcfb4d\" (UID: \"7205a33d-ffe1-447c-b1db-756842fcfb4d\") " Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.940964 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.940979 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941292 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941419 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941545 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941783 4921 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941798 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941810 4921 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7205a33d-ffe1-447c-b1db-756842fcfb4d-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941820 4921 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.941830 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.945928 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.946153 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7205a33d-ffe1-447c-b1db-756842fcfb4d-kube-api-access-qsgv2" (OuterVolumeSpecName: "kube-api-access-qsgv2") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "kube-api-access-qsgv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.946183 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.946589 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.950604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.951244 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.951409 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.951905 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:36 crc kubenswrapper[4921]: I0318 12:13:36.952073 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7205a33d-ffe1-447c-b1db-756842fcfb4d" (UID: "7205a33d-ffe1-447c-b1db-756842fcfb4d"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043373 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043425 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043445 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043464 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043483 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043503 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043520 4921 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043539 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7205a33d-ffe1-447c-b1db-756842fcfb4d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.043557 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsgv2\" (UniqueName: \"kubernetes.io/projected/7205a33d-ffe1-447c-b1db-756842fcfb4d-kube-api-access-qsgv2\") on node \"crc\" DevicePath \"\"" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.658627 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.658607 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" event={"ID":"7205a33d-ffe1-447c-b1db-756842fcfb4d","Type":"ContainerDied","Data":"65183520cf73022ac3bbc6c466a18839e7872463aa50dc9adf3497bacd142d4c"} Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.658831 4921 scope.go:117] "RemoveContainer" containerID="f4b91ce515379e77672f494b0a44cb0b6f3407484284b5ac7d9016f3126c151e" Mar 18 12:13:37 crc kubenswrapper[4921]: E0318 12:13:37.659517 4921 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.659641 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.660037 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.663941 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:37 crc kubenswrapper[4921]: I0318 12:13:37.664453 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:40 crc kubenswrapper[4921]: E0318 12:13:40.187204 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee75330f84eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:13:35.389082859 +0000 UTC m=+234.939003538,LastTimestamp:2026-03-18 12:13:35.389082859 +0000 UTC m=+234.939003538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:13:41 crc kubenswrapper[4921]: I0318 12:13:41.212954 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:41 crc kubenswrapper[4921]: I0318 12:13:41.213894 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:41 crc kubenswrapper[4921]: E0318 12:13:41.971036 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:41Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:41Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:41Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:13:41Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:41 crc kubenswrapper[4921]: E0318 12:13:41.971751 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:41 crc kubenswrapper[4921]: E0318 12:13:41.972257 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:41 crc kubenswrapper[4921]: E0318 12:13:41.972811 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 
12:13:41 crc kubenswrapper[4921]: E0318 12:13:41.973872 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:41 crc kubenswrapper[4921]: E0318 12:13:41.973916 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:13:42 crc kubenswrapper[4921]: E0318 12:13:42.988569 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="7s" Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.209070 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.211298 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.211821 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused" Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.226590 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931" Mar 18 
12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.226619 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:44 crc kubenswrapper[4921]: E0318 12:13:44.226986 4921 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.227554 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:44 crc kubenswrapper[4921]: W0318 12:13:44.247676 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8968a9830e4d7e1ea830343404f863930c0631bdb44268f76cfc7dd579f82e55 WatchSource:0}: Error finding container 8968a9830e4d7e1ea830343404f863930c0631bdb44268f76cfc7dd579f82e55: Status 404 returned error can't find the container with id 8968a9830e4d7e1ea830343404f863930c0631bdb44268f76cfc7dd579f82e55
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.704492 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.705379 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.705431 4921 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="acadff4bcb285c38774780df7e1790b5ff60121600c87cf90f3e2a464c849d26" exitCode=1
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.705491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"acadff4bcb285c38774780df7e1790b5ff60121600c87cf90f3e2a464c849d26"}
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.705945 4921 scope.go:117] "RemoveContainer" containerID="acadff4bcb285c38774780df7e1790b5ff60121600c87cf90f3e2a464c849d26"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.707294 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.707541 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.707748 4921 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.200:6443: connect: connection refused"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.708474 4921 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f1e4aa550c55b0c2a1f88456166237b2a198171aeb5e15bbcc125a61833ffcb2" exitCode=0
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.708563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f1e4aa550c55b0c2a1f88456166237b2a198171aeb5e15bbcc125a61833ffcb2"}
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.708678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8968a9830e4d7e1ea830343404f863930c0631bdb44268f76cfc7dd579f82e55"}
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.709248 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.709274 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.709376 4921 status_manager.go:851] "Failed to get status for pod" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.200:6443: connect: connection refused"
Mar 18 12:13:44 crc kubenswrapper[4921]: E0318 12:13:44.709718 4921 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.709937 4921 status_manager.go:851] "Failed to get status for pod" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" pod="openshift-authentication/oauth-openshift-558db77b4-kz82p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz82p\": dial tcp 38.129.56.200:6443: connect: connection refused"
Mar 18 12:13:44 crc kubenswrapper[4921]: I0318 12:13:44.710405 4921 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.200:6443: connect: connection refused"
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.257980 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.719239 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.719720 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.719812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3595246bdb0c5ea76632568d1dc4eaa711e5eee7eedf4b4ad4e2c4cf697c3916"}
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.723294 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d69773f6b3c36d94f1e114f486d1761b2554beb3abd82260096bc1dfd5ff308f"}
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.723348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"969f77bd430a5148adb5ef298417c0c0c4b505d2879131657c2aa8d8cc097545"}
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.723363 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"519eb86e72a28427d1cd9c17d97c71cf41fa891c57573a2196ceabc7fbe47c0a"}
Mar 18 12:13:45 crc kubenswrapper[4921]: I0318 12:13:45.723376 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"747daf53b7c2c3c3ac3a909fb539ea158fd2b17d23581cf58e305f843249b783"}
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.644942 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.652092 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.734438 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"00eefb05b63a9bfc154fba9d83de1f9d992bc751650867361bedb92818e4fca7"}
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.734710 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.735169 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.735320 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:46 crc kubenswrapper[4921]: I0318 12:13:46.735354 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:47 crc kubenswrapper[4921]: I0318 12:13:47.081669 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:13:47 crc kubenswrapper[4921]: I0318 12:13:47.081740 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:13:49 crc kubenswrapper[4921]: I0318 12:13:49.228490 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:49 crc kubenswrapper[4921]: I0318 12:13:49.228884 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:49 crc kubenswrapper[4921]: I0318 12:13:49.233372 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:51 crc kubenswrapper[4921]: I0318 12:13:51.744195 4921 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:51 crc kubenswrapper[4921]: I0318 12:13:51.764578 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:51 crc kubenswrapper[4921]: I0318 12:13:51.764610 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:51 crc kubenswrapper[4921]: I0318 12:13:51.773684 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:13:51 crc kubenswrapper[4921]: I0318 12:13:51.776066 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4874b382-a252-4957-8831-f82951580819"
Mar 18 12:13:52 crc kubenswrapper[4921]: I0318 12:13:52.771258 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:52 crc kubenswrapper[4921]: I0318 12:13:52.771306 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:13:55 crc kubenswrapper[4921]: I0318 12:13:55.263190 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 12:14:01 crc kubenswrapper[4921]: I0318 12:14:01.224076 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4874b382-a252-4957-8831-f82951580819"
Mar 18 12:14:02 crc kubenswrapper[4921]: I0318 12:14:02.049516 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 12:14:02 crc kubenswrapper[4921]: I0318 12:14:02.062709 4921 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 18 12:14:02 crc kubenswrapper[4921]: I0318 12:14:02.075180 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 18 12:14:02 crc kubenswrapper[4921]: I0318 12:14:02.080052 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 18 12:14:02 crc kubenswrapper[4921]: I0318 12:14:02.810474 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 18 12:14:02 crc kubenswrapper[4921]: I0318 12:14:02.873306 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.092205 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.094351 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.166281 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.169900 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.172809 4921 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.597515 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.626061 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.635948 4921 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.863235 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 18 12:14:03 crc kubenswrapper[4921]: I0318 12:14:03.867875 4921 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.016654 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.111896 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.208046 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.234482 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.459191 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.754866 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.817420 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.890479 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.931787 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 18 12:14:04 crc kubenswrapper[4921]: I0318 12:14:04.969224 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.012912 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.054485 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.187310 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.197832 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.199398 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.286430 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.613221 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.656131 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.679911 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.750909 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 12:14:05 crc kubenswrapper[4921]: I0318 12:14:05.882157 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.002220 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.042234 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.049052 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.059511 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.083848 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.218620 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.393027 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.468710 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.682281 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.709215 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.735420 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.874317 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.976369 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 18 12:14:06 crc kubenswrapper[4921]: I0318 12:14:06.994852 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.010220 4921 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.016621 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz82p","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.016720 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"]
Mar 18 12:14:07 crc kubenswrapper[4921]: E0318 12:14:07.016985 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" containerName="oauth-openshift"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017011 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" containerName="oauth-openshift"
Mar 18 12:14:07 crc kubenswrapper[4921]: E0318 12:14:07.017025 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" containerName="installer"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017035 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" containerName="installer"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017185 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" containerName="oauth-openshift"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017203 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57598a3-9521-4697-969f-6c365d74c4e7" containerName="installer"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017471 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017521 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f3975745-b934-4ee7-9835-15eaeb9e2931"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.017738 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.020386 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.021133 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.021190 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.021328 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.021410 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.021663 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.026492 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.031967 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.033816 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.039485 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.048860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38f1b563-8825-4f47-9847-ead9bb56afe9-serving-cert\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.048996 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-config\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.049073 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-proxy-ca-bundles\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.049272 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qqs\" (UniqueName: \"kubernetes.io/projected/38f1b563-8825-4f47-9847-ead9bb56afe9-kube-api-access-b8qqs\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.049362 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-client-ca\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.052087 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.059409 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.059391197 podStartE2EDuration="16.059391197s" podCreationTimestamp="2026-03-18 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:07.058687617 +0000 UTC m=+266.608608306" watchObservedRunningTime="2026-03-18 12:14:07.059391197 +0000 UTC m=+266.609311846"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.074678 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.078557 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.131854 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.150217 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qqs\" (UniqueName: \"kubernetes.io/projected/38f1b563-8825-4f47-9847-ead9bb56afe9-kube-api-access-b8qqs\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.150335 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-client-ca\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.150380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38f1b563-8825-4f47-9847-ead9bb56afe9-serving-cert\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.150417 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-config\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.150446 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-proxy-ca-bundles\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.151798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-client-ca\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.151957 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-proxy-ca-bundles\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.152395 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f1b563-8825-4f47-9847-ead9bb56afe9-config\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.166537 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38f1b563-8825-4f47-9847-ead9bb56afe9-serving-cert\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.168899 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.183661 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qqs\" (UniqueName: \"kubernetes.io/projected/38f1b563-8825-4f47-9847-ead9bb56afe9-kube-api-access-b8qqs\") pod \"controller-manager-69bf8cf48d-gt58f\" (UID: \"38f1b563-8825-4f47-9847-ead9bb56afe9\") " pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.204335 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.214608 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7205a33d-ffe1-447c-b1db-756842fcfb4d" path="/var/lib/kubelet/pods/7205a33d-ffe1-447c-b1db-756842fcfb4d/volumes"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.247986 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.272398 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.344290 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.350080 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.466362 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.539653 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.541303 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.729030 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.764945 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.765191 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.788670 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.793732 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.812482 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.844231 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.884699 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 18 12:14:07 crc kubenswrapper[4921]: I0318 12:14:07.923078 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.055701 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.139864 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.145241 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.243831 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.304232 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.343570 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.344466 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.366176 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.374187 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.400948 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.416081 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.462363 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.478309 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.643971
4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.662742 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.708810 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.723683 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.756363 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.761598 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.946608 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 12:14:08 crc kubenswrapper[4921]: I0318 12:14:08.956465 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.033160 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.058073 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.063645 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 
12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.070091 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.104170 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.239202 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.340428 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.381943 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.732245 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.755622 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.773325 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.924645 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.926449 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.933539 4921 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 12:14:09 crc kubenswrapper[4921]: I0318 12:14:09.972398 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.078599 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.126629 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.154464 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.173423 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.174047 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.182454 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.296462 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.375179 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.480857 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:14:10 crc 
kubenswrapper[4921]: I0318 12:14:10.494329 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.610763 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.621688 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.695155 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.824678 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.944520 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.984765 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 12:14:10 crc kubenswrapper[4921]: I0318 12:14:10.999901 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.007953 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.090608 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.102453 4921 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"] Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.104182 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.180893 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.242923 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.328972 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.332044 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.367721 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.550916 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.557251 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.625629 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.627034 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.627478 
4921 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.654520 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.655852 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.663865 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: E0318 12:14:11.672922 4921 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 12:14:11 crc kubenswrapper[4921]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-69bf8cf48d-gt58f_openshift-controller-manager_38f1b563-8825-4f47-9847-ead9bb56afe9_0(cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc): error adding pod openshift-controller-manager_controller-manager-69bf8cf48d-gt58f to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc" Netns:"/var/run/netns/e6c9e397-d845-43bf-af69-f0564ca54984" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-69bf8cf48d-gt58f;K8S_POD_INFRA_CONTAINER_ID=cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc;K8S_POD_UID=38f1b563-8825-4f47-9847-ead9bb56afe9" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f] networking: Multus: [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f/38f1b563-8825-4f47-9847-ead9bb56afe9]: error setting the networks status, pod was 
already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-69bf8cf48d-gt58f in out of cluster comm: pod "controller-manager-69bf8cf48d-gt58f" not found Mar 18 12:14:11 crc kubenswrapper[4921]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 12:14:11 crc kubenswrapper[4921]: > Mar 18 12:14:11 crc kubenswrapper[4921]: E0318 12:14:11.673011 4921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 12:14:11 crc kubenswrapper[4921]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-69bf8cf48d-gt58f_openshift-controller-manager_38f1b563-8825-4f47-9847-ead9bb56afe9_0(cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc): error adding pod openshift-controller-manager_controller-manager-69bf8cf48d-gt58f to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc" Netns:"/var/run/netns/e6c9e397-d845-43bf-af69-f0564ca54984" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-69bf8cf48d-gt58f;K8S_POD_INFRA_CONTAINER_ID=cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc;K8S_POD_UID=38f1b563-8825-4f47-9847-ead9bb56afe9" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f] networking: Multus: [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f/38f1b563-8825-4f47-9847-ead9bb56afe9]: error setting the networks status, pod was 
already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-69bf8cf48d-gt58f in out of cluster comm: pod "controller-manager-69bf8cf48d-gt58f" not found Mar 18 12:14:11 crc kubenswrapper[4921]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 12:14:11 crc kubenswrapper[4921]: > pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" Mar 18 12:14:11 crc kubenswrapper[4921]: E0318 12:14:11.673032 4921 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 12:14:11 crc kubenswrapper[4921]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-69bf8cf48d-gt58f_openshift-controller-manager_38f1b563-8825-4f47-9847-ead9bb56afe9_0(cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc): error adding pod openshift-controller-manager_controller-manager-69bf8cf48d-gt58f to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc" Netns:"/var/run/netns/e6c9e397-d845-43bf-af69-f0564ca54984" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-69bf8cf48d-gt58f;K8S_POD_INFRA_CONTAINER_ID=cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc;K8S_POD_UID=38f1b563-8825-4f47-9847-ead9bb56afe9" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f] networking: Multus: 
[openshift-controller-manager/controller-manager-69bf8cf48d-gt58f/38f1b563-8825-4f47-9847-ead9bb56afe9]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-69bf8cf48d-gt58f in out of cluster comm: pod "controller-manager-69bf8cf48d-gt58f" not found Mar 18 12:14:11 crc kubenswrapper[4921]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 12:14:11 crc kubenswrapper[4921]: > pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" Mar 18 12:14:11 crc kubenswrapper[4921]: E0318 12:14:11.673095 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-69bf8cf48d-gt58f_openshift-controller-manager(38f1b563-8825-4f47-9847-ead9bb56afe9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-69bf8cf48d-gt58f_openshift-controller-manager(38f1b563-8825-4f47-9847-ead9bb56afe9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-69bf8cf48d-gt58f_openshift-controller-manager_38f1b563-8825-4f47-9847-ead9bb56afe9_0(cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc): error adding pod openshift-controller-manager_controller-manager-69bf8cf48d-gt58f to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc\\\" Netns:\\\"/var/run/netns/e6c9e397-d845-43bf-af69-f0564ca54984\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-69bf8cf48d-gt58f;K8S_POD_INFRA_CONTAINER_ID=cebab61e3edf51059a4ce88c02ce96fdd108188ad6f117ddd2f44bd6bea76afc;K8S_POD_UID=38f1b563-8825-4f47-9847-ead9bb56afe9\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f] networking: Multus: [openshift-controller-manager/controller-manager-69bf8cf48d-gt58f/38f1b563-8825-4f47-9847-ead9bb56afe9]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-69bf8cf48d-gt58f in out of cluster comm: pod \\\"controller-manager-69bf8cf48d-gt58f\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" podUID="38f1b563-8825-4f47-9847-ead9bb56afe9" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.682545 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.688173 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.804628 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.831923 4921 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.850824 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.893514 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.893918 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.928976 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.931212 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.953811 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.980144 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 12:14:11 crc kubenswrapper[4921]: I0318 12:14:11.985207 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.091417 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.121484 4921 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.160557 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.200289 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.243535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.293958 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.296542 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.370063 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.481043 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.496744 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.510872 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.529074 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69bf8cf48d-gt58f"] Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.641753 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.678416 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.900210 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" event={"ID":"38f1b563-8825-4f47-9847-ead9bb56afe9","Type":"ContainerStarted","Data":"f8678627e04735fda8b469461ed4a3cfb2d5f73877b0a115ba74aa48c25536c2"} Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.900601 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.900616 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" event={"ID":"38f1b563-8825-4f47-9847-ead9bb56afe9","Type":"ContainerStarted","Data":"d8b6fa9363fa239a6729478e80939d1316780980a330eb136b5065ffc57581ef"} Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.921634 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" podStartSLOduration=45.921609765 podStartE2EDuration="45.921609765s" podCreationTimestamp="2026-03-18 12:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:12.919187494 +0000 UTC m=+272.469108143" watchObservedRunningTime="2026-03-18 12:14:12.921609765 +0000 UTC m=+272.471530404" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.926653 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 12:14:12 crc 
kubenswrapper[4921]: I0318 12:14:12.926718 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69bf8cf48d-gt58f" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.998061 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 12:14:12 crc kubenswrapper[4921]: I0318 12:14:12.998066 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.017321 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.077345 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.168977 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.194811 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.237325 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.281557 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563934-mqj9b"] Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.282304 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:13 crc kubenswrapper[4921]: W0318 12:14:13.283953 4921 reflector.go:561] object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49": failed to list *v1.Secret: secrets "csr-approver-sa-dockercfg-cbc49" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-infra": no relationship found between node 'crc' and this object Mar 18 12:14:13 crc kubenswrapper[4921]: E0318 12:14:13.283997 4921 reflector.go:158] "Unhandled Error" err="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-cbc49\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"csr-approver-sa-dockercfg-cbc49\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-infra\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.284018 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.285841 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f9d58f4c-94xhs"] Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.286650 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: W0318 12:14:13.289206 4921 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 18 12:14:13 crc kubenswrapper[4921]: W0318 12:14:13.289223 4921 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 18 12:14:13 crc kubenswrapper[4921]: E0318 12:14:13.289263 4921 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:14:13 crc kubenswrapper[4921]: W0318 12:14:13.289205 4921 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 18 12:14:13 crc kubenswrapper[4921]: E0318 12:14:13.289299 4921 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets 
\"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:14:13 crc kubenswrapper[4921]: E0318 12:14:13.289296 4921 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:14:13 crc kubenswrapper[4921]: W0318 12:14:13.289239 4921 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 18 12:14:13 crc kubenswrapper[4921]: E0318 12:14:13.289356 4921 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.289539 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.289561 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.291479 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.291570 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.291490 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.292080 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.292083 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 12:14:13 crc kubenswrapper[4921]: W0318 12:14:13.292420 4921 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Mar 18 12:14:13 crc kubenswrapper[4921]: E0318 12:14:13.292467 4921 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 12:14:13 crc 
kubenswrapper[4921]: I0318 12:14:13.292541 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.292571 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.295958 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.302228 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.310918 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-mqj9b"] Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.318720 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f9d58f4c-94xhs"] Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-session\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 
12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453503 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx79k\" (UniqueName: \"kubernetes.io/projected/2bd59b71-2947-4e73-872d-4e84bb7413bc-kube-api-access-dx79k\") pod \"auto-csr-approver-29563934-mqj9b\" (UID: \"2bd59b71-2947-4e73-872d-4e84bb7413bc\") " pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453528 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-error\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453594 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453700 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453752 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-audit-policies\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453797 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-router-certs\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453901 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w77ds\" (UniqueName: \"kubernetes.io/projected/1c224205-4e10-47f8-9128-6cf60385b195-kube-api-access-w77ds\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.453971 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: 
\"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.454016 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c224205-4e10-47f8-9128-6cf60385b195-audit-dir\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.454040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.454072 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-login\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.454147 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-service-ca\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.513397 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.527205 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554626 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-service-ca\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-session\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554715 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx79k\" (UniqueName: \"kubernetes.io/projected/2bd59b71-2947-4e73-872d-4e84bb7413bc-kube-api-access-dx79k\") pod \"auto-csr-approver-29563934-mqj9b\" (UID: 
\"2bd59b71-2947-4e73-872d-4e84bb7413bc\") " pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-error\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554750 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554781 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554809 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554826 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-audit-policies\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554846 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-router-certs\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554889 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w77ds\" (UniqueName: \"kubernetes.io/projected/1c224205-4e10-47f8-9128-6cf60385b195-kube-api-access-w77ds\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554908 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554933 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c224205-4e10-47f8-9128-6cf60385b195-audit-dir\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " 
pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.554970 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-login\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.555228 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c224205-4e10-47f8-9128-6cf60385b195-audit-dir\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.555475 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-service-ca\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.555640 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.555768 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.560277 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.560903 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-error\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.561339 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-router-certs\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " 
pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.567711 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-template-login\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.572786 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.573292 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx79k\" (UniqueName: \"kubernetes.io/projected/2bd59b71-2947-4e73-872d-4e84bb7413bc-kube-api-access-dx79k\") pod \"auto-csr-approver-29563934-mqj9b\" (UID: \"2bd59b71-2947-4e73-872d-4e84bb7413bc\") " pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.574597 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-session\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.617245 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.643314 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.660131 4921 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.670204 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.765732 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.774967 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.810461 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:14:13 crc kubenswrapper[4921]: I0318 12:14:13.865603 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.002792 4921 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.003099 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6" gracePeriod=5 Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.119583 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.137721 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 12:14:14 crc 
kubenswrapper[4921]: I0318 12:14:14.150003 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.152402 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w77ds\" (UniqueName: \"kubernetes.io/projected/1c224205-4e10-47f8-9128-6cf60385b195-kube-api-access-w77ds\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.158467 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.168633 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.178191 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.224697 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.226481 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c224205-4e10-47f8-9128-6cf60385b195-audit-policies\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.287092 
4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.315210 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.328341 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.353012 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.362068 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:14:14 crc kubenswrapper[4921]: E0318 12:14:14.556496 4921 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Mar 18 12:14:14 crc kubenswrapper[4921]: E0318 12:14:14.556643 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-ocp-branding-template podName:1c224205-4e10-47f8-9128-6cf60385b195 nodeName:}" failed. No retries permitted until 2026-03-18 12:14:15.056610039 +0000 UTC m=+274.606530718 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-f9d58f4c-94xhs" (UID: "1c224205-4e10-47f8-9128-6cf60385b195") : failed to sync secret cache: timed out waiting for the condition Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.568955 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.587188 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.588210 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.605179 4921 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-infra/auto-csr-approver-29563934-mqj9b" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.605250 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.632215 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.632476 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.723170 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.890193 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.930409 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 12:14:14 crc kubenswrapper[4921]: I0318 12:14:14.931822 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.036927 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-mqj9b"] Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.080735 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.087637 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1c224205-4e10-47f8-9128-6cf60385b195-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f9d58f4c-94xhs\" (UID: \"1c224205-4e10-47f8-9128-6cf60385b195\") " pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.111444 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.132989 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.152689 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.235997 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.250123 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.362809 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.465213 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.507305 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.514623 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 12:14:15 crc kubenswrapper[4921]: 
I0318 12:14:15.562940 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f9d58f4c-94xhs"] Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.607993 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.681327 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.880682 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.904064 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.914180 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" event={"ID":"2bd59b71-2947-4e73-872d-4e84bb7413bc","Type":"ContainerStarted","Data":"0a02c150f2f37073c4ce01f3acbf53161ac037ccba98dc650a69f7163b6864e0"} Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.915937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" event={"ID":"1c224205-4e10-47f8-9128-6cf60385b195","Type":"ContainerStarted","Data":"4c977f39da298bffb05578a0c51fa2f267a418fca26d8cb76caf8bd849629536"} Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.915968 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" event={"ID":"1c224205-4e10-47f8-9128-6cf60385b195","Type":"ContainerStarted","Data":"bfd0e900cf30c30fe6ca0d40c64ea920e21f41bb2812bd75842fb521135f5ddd"} Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.917276 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.923938 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 12:14:15 crc kubenswrapper[4921]: I0318 12:14:15.939567 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" podStartSLOduration=64.939545454 podStartE2EDuration="1m4.939545454s" podCreationTimestamp="2026-03-18 12:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:15.935087665 +0000 UTC m=+275.485008314" watchObservedRunningTime="2026-03-18 12:14:15.939545454 +0000 UTC m=+275.489466103" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.004477 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.041700 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.106729 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.169881 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.353262 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f9d58f4c-94xhs" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.432472 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 
12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.465613 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.500720 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.726247 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.830708 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.923313 4921 generic.go:334] "Generic (PLEG): container finished" podID="2bd59b71-2947-4e73-872d-4e84bb7413bc" containerID="c9613ac5784ba0aab2ba449f1282c59af9fcf8704844927504ca4638bc9d70a6" exitCode=0 Mar 18 12:14:16 crc kubenswrapper[4921]: I0318 12:14:16.923385 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" event={"ID":"2bd59b71-2947-4e73-872d-4e84bb7413bc","Type":"ContainerDied","Data":"c9613ac5784ba0aab2ba449f1282c59af9fcf8704844927504ca4638bc9d70a6"} Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.045863 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.081272 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.081335 4921 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.104819 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.137179 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.594085 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.605206 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 12:14:17 crc kubenswrapper[4921]: I0318 12:14:17.845899 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.327035 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.433914 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx79k\" (UniqueName: \"kubernetes.io/projected/2bd59b71-2947-4e73-872d-4e84bb7413bc-kube-api-access-dx79k\") pod \"2bd59b71-2947-4e73-872d-4e84bb7413bc\" (UID: \"2bd59b71-2947-4e73-872d-4e84bb7413bc\") " Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.443701 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd59b71-2947-4e73-872d-4e84bb7413bc-kube-api-access-dx79k" (OuterVolumeSpecName: "kube-api-access-dx79k") pod "2bd59b71-2947-4e73-872d-4e84bb7413bc" (UID: "2bd59b71-2947-4e73-872d-4e84bb7413bc"). InnerVolumeSpecName "kube-api-access-dx79k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.457420 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.535573 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx79k\" (UniqueName: \"kubernetes.io/projected/2bd59b71-2947-4e73-872d-4e84bb7413bc-kube-api-access-dx79k\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.941920 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" event={"ID":"2bd59b71-2947-4e73-872d-4e84bb7413bc","Type":"ContainerDied","Data":"0a02c150f2f37073c4ce01f3acbf53161ac037ccba98dc650a69f7163b6864e0"} Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.941955 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a02c150f2f37073c4ce01f3acbf53161ac037ccba98dc650a69f7163b6864e0" Mar 18 12:14:18 crc kubenswrapper[4921]: I0318 12:14:18.942021 4921 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-mqj9b" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.176342 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.586433 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.586528 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.748995 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749059 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749073 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749134 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749171 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749193 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749231 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749306 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749364 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749488 4921 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749507 4921 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749522 4921 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.749536 4921 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.751509 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.757261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.850423 4921 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.948163 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.948226 4921 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6" exitCode=137 Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.948281 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.948321 4921 scope.go:117] "RemoveContainer" containerID="d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.978984 4921 scope.go:117] "RemoveContainer" containerID="d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6" Mar 18 12:14:19 crc kubenswrapper[4921]: E0318 12:14:19.979533 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6\": container with ID starting with d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6 not found: ID does not exist" containerID="d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6" Mar 18 12:14:19 crc kubenswrapper[4921]: I0318 12:14:19.979578 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6"} err="failed to get container status \"d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6\": rpc error: code = NotFound desc = could not find container \"d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6\": container with ID starting with d4d753a3111d20a6cebabc15d63a1b781af6a4ff91265a693d914acc607064e6 not found: ID does not exist" Mar 18 12:14:21 crc kubenswrapper[4921]: I0318 12:14:21.218367 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 12:14:47 crc kubenswrapper[4921]: I0318 12:14:47.081073 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:14:47 crc kubenswrapper[4921]: I0318 12:14:47.081847 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:14:47 crc kubenswrapper[4921]: I0318 12:14:47.081897 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:14:47 crc kubenswrapper[4921]: I0318 12:14:47.082543 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9"} 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:14:47 crc kubenswrapper[4921]: I0318 12:14:47.082611 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9" gracePeriod=600 Mar 18 12:14:48 crc kubenswrapper[4921]: I0318 12:14:48.130941 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9" exitCode=0 Mar 18 12:14:48 crc kubenswrapper[4921]: I0318 12:14:48.131049 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9"} Mar 18 12:14:48 crc kubenswrapper[4921]: I0318 12:14:48.131334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"90c2627d76f56111b8091bf7cf164f6179d0606e616e7e916cb477a7a9cb4d04"} Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.135202 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8"] Mar 18 12:15:00 crc kubenswrapper[4921]: E0318 12:15:00.136028 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd59b71-2947-4e73-872d-4e84bb7413bc" containerName="oc" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.136051 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2bd59b71-2947-4e73-872d-4e84bb7413bc" containerName="oc" Mar 18 12:15:00 crc kubenswrapper[4921]: E0318 12:15:00.136080 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.136090 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.136241 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd59b71-2947-4e73-872d-4e84bb7413bc" containerName="oc" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.136258 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.136687 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.139619 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.144929 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.146379 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8"] Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.218188 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56299bf-8e89-4425-914d-1aa2f0e635af-config-volume\") pod \"collect-profiles-29563935-slzc8\" (UID: 
\"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.218357 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c56299bf-8e89-4425-914d-1aa2f0e635af-secret-volume\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.218406 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg9zh\" (UniqueName: \"kubernetes.io/projected/c56299bf-8e89-4425-914d-1aa2f0e635af-kube-api-access-vg9zh\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.319169 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56299bf-8e89-4425-914d-1aa2f0e635af-config-volume\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.319246 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c56299bf-8e89-4425-914d-1aa2f0e635af-secret-volume\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.319289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vg9zh\" (UniqueName: \"kubernetes.io/projected/c56299bf-8e89-4425-914d-1aa2f0e635af-kube-api-access-vg9zh\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.320297 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56299bf-8e89-4425-914d-1aa2f0e635af-config-volume\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.325856 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c56299bf-8e89-4425-914d-1aa2f0e635af-secret-volume\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.336706 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg9zh\" (UniqueName: \"kubernetes.io/projected/c56299bf-8e89-4425-914d-1aa2f0e635af-kube-api-access-vg9zh\") pod \"collect-profiles-29563935-slzc8\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.459182 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:00 crc kubenswrapper[4921]: I0318 12:15:00.844173 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8"] Mar 18 12:15:01 crc kubenswrapper[4921]: I0318 12:15:01.201037 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" event={"ID":"c56299bf-8e89-4425-914d-1aa2f0e635af","Type":"ContainerStarted","Data":"a6a8c982494967ed14e25356adf187fd170f478d67715573db095b4a97df4625"} Mar 18 12:15:02 crc kubenswrapper[4921]: I0318 12:15:02.209329 4921 generic.go:334] "Generic (PLEG): container finished" podID="c56299bf-8e89-4425-914d-1aa2f0e635af" containerID="9fcf4697e2c7d18c089b8035cad2591e0554412a34d6f3af359fff72d5505a16" exitCode=0 Mar 18 12:15:02 crc kubenswrapper[4921]: I0318 12:15:02.209598 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" event={"ID":"c56299bf-8e89-4425-914d-1aa2f0e635af","Type":"ContainerDied","Data":"9fcf4697e2c7d18c089b8035cad2591e0554412a34d6f3af359fff72d5505a16"} Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.500193 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.559852 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c56299bf-8e89-4425-914d-1aa2f0e635af-secret-volume\") pod \"c56299bf-8e89-4425-914d-1aa2f0e635af\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.559921 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56299bf-8e89-4425-914d-1aa2f0e635af-config-volume\") pod \"c56299bf-8e89-4425-914d-1aa2f0e635af\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.559987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg9zh\" (UniqueName: \"kubernetes.io/projected/c56299bf-8e89-4425-914d-1aa2f0e635af-kube-api-access-vg9zh\") pod \"c56299bf-8e89-4425-914d-1aa2f0e635af\" (UID: \"c56299bf-8e89-4425-914d-1aa2f0e635af\") " Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.561142 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56299bf-8e89-4425-914d-1aa2f0e635af-config-volume" (OuterVolumeSpecName: "config-volume") pod "c56299bf-8e89-4425-914d-1aa2f0e635af" (UID: "c56299bf-8e89-4425-914d-1aa2f0e635af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.566317 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56299bf-8e89-4425-914d-1aa2f0e635af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c56299bf-8e89-4425-914d-1aa2f0e635af" (UID: "c56299bf-8e89-4425-914d-1aa2f0e635af"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.567411 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56299bf-8e89-4425-914d-1aa2f0e635af-kube-api-access-vg9zh" (OuterVolumeSpecName: "kube-api-access-vg9zh") pod "c56299bf-8e89-4425-914d-1aa2f0e635af" (UID: "c56299bf-8e89-4425-914d-1aa2f0e635af"). InnerVolumeSpecName "kube-api-access-vg9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.661272 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg9zh\" (UniqueName: \"kubernetes.io/projected/c56299bf-8e89-4425-914d-1aa2f0e635af-kube-api-access-vg9zh\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.661310 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c56299bf-8e89-4425-914d-1aa2f0e635af-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:03 crc kubenswrapper[4921]: I0318 12:15:03.661320 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c56299bf-8e89-4425-914d-1aa2f0e635af-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:04 crc kubenswrapper[4921]: I0318 12:15:04.235022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" event={"ID":"c56299bf-8e89-4425-914d-1aa2f0e635af","Type":"ContainerDied","Data":"a6a8c982494967ed14e25356adf187fd170f478d67715573db095b4a97df4625"} Mar 18 12:15:04 crc kubenswrapper[4921]: I0318 12:15:04.235063 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6a8c982494967ed14e25356adf187fd170f478d67715573db095b4a97df4625" Mar 18 12:15:04 crc kubenswrapper[4921]: I0318 12:15:04.235137 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.768377 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdd82"] Mar 18 12:15:15 crc kubenswrapper[4921]: E0318 12:15:15.769231 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56299bf-8e89-4425-914d-1aa2f0e635af" containerName="collect-profiles" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.769245 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56299bf-8e89-4425-914d-1aa2f0e635af" containerName="collect-profiles" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.769354 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56299bf-8e89-4425-914d-1aa2f0e635af" containerName="collect-profiles" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.769709 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.825771 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a300d26a-0793-463c-b8c7-86d1a02ffb49-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.825843 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a300d26a-0793-463c-b8c7-86d1a02ffb49-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.825878 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a300d26a-0793-463c-b8c7-86d1a02ffb49-registry-certificates\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.825917 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.826015 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-registry-tls\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.826238 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-bound-sa-token\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.826325 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw6kk\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-kube-api-access-jw6kk\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.826443 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a300d26a-0793-463c-b8c7-86d1a02ffb49-trusted-ca\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.830630 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdd82"] Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.895055 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a300d26a-0793-463c-b8c7-86d1a02ffb49-registry-certificates\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927710 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-registry-tls\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927738 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-bound-sa-token\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927764 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6kk\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-kube-api-access-jw6kk\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927800 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a300d26a-0793-463c-b8c7-86d1a02ffb49-trusted-ca\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927832 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a300d26a-0793-463c-b8c7-86d1a02ffb49-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.927863 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a300d26a-0793-463c-b8c7-86d1a02ffb49-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.928678 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a300d26a-0793-463c-b8c7-86d1a02ffb49-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.929204 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a300d26a-0793-463c-b8c7-86d1a02ffb49-trusted-ca\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 
12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.929523 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a300d26a-0793-463c-b8c7-86d1a02ffb49-registry-certificates\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.934642 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a300d26a-0793-463c-b8c7-86d1a02ffb49-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.937874 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-registry-tls\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.947672 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6kk\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-kube-api-access-jw6kk\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:15 crc kubenswrapper[4921]: I0318 12:15:15.947629 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a300d26a-0793-463c-b8c7-86d1a02ffb49-bound-sa-token\") pod \"image-registry-66df7c8f76-jdd82\" (UID: \"a300d26a-0793-463c-b8c7-86d1a02ffb49\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:16 crc kubenswrapper[4921]: I0318 12:15:16.096545 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:16 crc kubenswrapper[4921]: I0318 12:15:16.517528 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdd82"] Mar 18 12:15:17 crc kubenswrapper[4921]: I0318 12:15:17.328449 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" event={"ID":"a300d26a-0793-463c-b8c7-86d1a02ffb49","Type":"ContainerStarted","Data":"84e8fb2eb9d822f5309c9873455eae4e686b2e395df1bdd53c8a3bcc736ae508"} Mar 18 12:15:17 crc kubenswrapper[4921]: I0318 12:15:17.329169 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:17 crc kubenswrapper[4921]: I0318 12:15:17.329197 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" event={"ID":"a300d26a-0793-463c-b8c7-86d1a02ffb49","Type":"ContainerStarted","Data":"1b1c36537998b5c18961d9b1c7d822ac41055dfe8c6809281e66b5b8faf9dbf4"} Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.560369 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" podStartSLOduration=20.560349088 podStartE2EDuration="20.560349088s" podCreationTimestamp="2026-03-18 12:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:17.36339027 +0000 UTC m=+336.913310919" watchObservedRunningTime="2026-03-18 12:15:35.560349088 +0000 UTC m=+355.110269737" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.567526 4921 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-8gvwb"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.567884 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8gvwb" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="registry-server" containerID="cri-o://e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965" gracePeriod=30 Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.573184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4c8k6"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.573436 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4c8k6" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="registry-server" containerID="cri-o://226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310" gracePeriod=30 Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.587396 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4lm"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.587690 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" podUID="eac88057-b7cd-4264-861c-b7d53340338d" containerName="marketplace-operator" containerID="cri-o://99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467" gracePeriod=30 Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.594268 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshr9"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.594767 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dshr9" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="registry-server" 
containerID="cri-o://ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2" gracePeriod=30 Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.596542 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pf9vr"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.596697 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pf9vr" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="registry-server" containerID="cri-o://becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b" gracePeriod=30 Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.613849 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mbf4l"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.618479 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mbf4l"] Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.618569 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.655471 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwb2\" (UniqueName: \"kubernetes.io/projected/7908e95d-6c90-41f8-924d-072fd69e70c6-kube-api-access-bjwb2\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.655548 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7908e95d-6c90-41f8-924d-072fd69e70c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.655580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7908e95d-6c90-41f8-924d-072fd69e70c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.757229 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7908e95d-6c90-41f8-924d-072fd69e70c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.757303 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7908e95d-6c90-41f8-924d-072fd69e70c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.757362 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwb2\" (UniqueName: \"kubernetes.io/projected/7908e95d-6c90-41f8-924d-072fd69e70c6-kube-api-access-bjwb2\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.759784 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7908e95d-6c90-41f8-924d-072fd69e70c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.768734 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7908e95d-6c90-41f8-924d-072fd69e70c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.774452 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwb2\" (UniqueName: \"kubernetes.io/projected/7908e95d-6c90-41f8-924d-072fd69e70c6-kube-api-access-bjwb2\") pod \"marketplace-operator-79b997595-mbf4l\" (UID: \"7908e95d-6c90-41f8-924d-072fd69e70c6\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:35 crc kubenswrapper[4921]: I0318 12:15:35.969562 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.029385 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.060242 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jft5x\" (UniqueName: \"kubernetes.io/projected/125cafaf-afed-45eb-b6c9-0f06ee2637ec-kube-api-access-jft5x\") pod \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.060295 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-utilities\") pod \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.060372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-catalog-content\") pod \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\" (UID: \"125cafaf-afed-45eb-b6c9-0f06ee2637ec\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.062411 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-utilities" (OuterVolumeSpecName: "utilities") pod "125cafaf-afed-45eb-b6c9-0f06ee2637ec" (UID: "125cafaf-afed-45eb-b6c9-0f06ee2637ec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.064278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125cafaf-afed-45eb-b6c9-0f06ee2637ec-kube-api-access-jft5x" (OuterVolumeSpecName: "kube-api-access-jft5x") pod "125cafaf-afed-45eb-b6c9-0f06ee2637ec" (UID: "125cafaf-afed-45eb-b6c9-0f06ee2637ec"). InnerVolumeSpecName "kube-api-access-jft5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.092604 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.097367 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.104302 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jdd82" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.105924 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.135922 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161669 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-trusted-ca\") pod \"eac88057-b7cd-4264-861c-b7d53340338d\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161719 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-utilities\") pod \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161745 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-utilities\") pod \"63c09902-e057-4d3a-811f-e068f2ebe716\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161771 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr7vf\" (UniqueName: \"kubernetes.io/projected/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-kube-api-access-tr7vf\") pod \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161803 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-utilities\") pod \"09b95848-38ec-4890-9cd2-83bc2e137c4a\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161820 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/09b95848-38ec-4890-9cd2-83bc2e137c4a-kube-api-access-qx4fw\") pod \"09b95848-38ec-4890-9cd2-83bc2e137c4a\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161892 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-catalog-content\") pod \"09b95848-38ec-4890-9cd2-83bc2e137c4a\" (UID: \"09b95848-38ec-4890-9cd2-83bc2e137c4a\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161924 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcfht\" (UniqueName: \"kubernetes.io/projected/eac88057-b7cd-4264-861c-b7d53340338d-kube-api-access-lcfht\") pod \"eac88057-b7cd-4264-861c-b7d53340338d\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.161988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-catalog-content\") pod \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\" (UID: \"b49f7bf4-ce72-4b66-8d0e-b2061d228a58\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162006 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtq8n\" (UniqueName: \"kubernetes.io/projected/63c09902-e057-4d3a-811f-e068f2ebe716-kube-api-access-wtq8n\") pod \"63c09902-e057-4d3a-811f-e068f2ebe716\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162192 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "eac88057-b7cd-4264-861c-b7d53340338d" (UID: 
"eac88057-b7cd-4264-861c-b7d53340338d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162223 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-catalog-content\") pod \"63c09902-e057-4d3a-811f-e068f2ebe716\" (UID: \"63c09902-e057-4d3a-811f-e068f2ebe716\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162394 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-operator-metrics\") pod \"eac88057-b7cd-4264-861c-b7d53340338d\" (UID: \"eac88057-b7cd-4264-861c-b7d53340338d\") " Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162805 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jft5x\" (UniqueName: \"kubernetes.io/projected/125cafaf-afed-45eb-b6c9-0f06ee2637ec-kube-api-access-jft5x\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162818 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.162828 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.167347 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-kube-api-access-tr7vf" (OuterVolumeSpecName: "kube-api-access-tr7vf") pod 
"b49f7bf4-ce72-4b66-8d0e-b2061d228a58" (UID: "b49f7bf4-ce72-4b66-8d0e-b2061d228a58"). InnerVolumeSpecName "kube-api-access-tr7vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.167711 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "125cafaf-afed-45eb-b6c9-0f06ee2637ec" (UID: "125cafaf-afed-45eb-b6c9-0f06ee2637ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.170656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-utilities" (OuterVolumeSpecName: "utilities") pod "09b95848-38ec-4890-9cd2-83bc2e137c4a" (UID: "09b95848-38ec-4890-9cd2-83bc2e137c4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.170675 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-utilities" (OuterVolumeSpecName: "utilities") pod "63c09902-e057-4d3a-811f-e068f2ebe716" (UID: "63c09902-e057-4d3a-811f-e068f2ebe716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.176810 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b95848-38ec-4890-9cd2-83bc2e137c4a-kube-api-access-qx4fw" (OuterVolumeSpecName: "kube-api-access-qx4fw") pod "09b95848-38ec-4890-9cd2-83bc2e137c4a" (UID: "09b95848-38ec-4890-9cd2-83bc2e137c4a"). InnerVolumeSpecName "kube-api-access-qx4fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.176996 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c09902-e057-4d3a-811f-e068f2ebe716-kube-api-access-wtq8n" (OuterVolumeSpecName: "kube-api-access-wtq8n") pod "63c09902-e057-4d3a-811f-e068f2ebe716" (UID: "63c09902-e057-4d3a-811f-e068f2ebe716"). InnerVolumeSpecName "kube-api-access-wtq8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.177088 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-utilities" (OuterVolumeSpecName: "utilities") pod "b49f7bf4-ce72-4b66-8d0e-b2061d228a58" (UID: "b49f7bf4-ce72-4b66-8d0e-b2061d228a58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.177262 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "eac88057-b7cd-4264-861c-b7d53340338d" (UID: "eac88057-b7cd-4264-861c-b7d53340338d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.181881 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac88057-b7cd-4264-861c-b7d53340338d-kube-api-access-lcfht" (OuterVolumeSpecName: "kube-api-access-lcfht") pod "eac88057-b7cd-4264-861c-b7d53340338d" (UID: "eac88057-b7cd-4264-861c-b7d53340338d"). InnerVolumeSpecName "kube-api-access-lcfht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.209069 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x4bzs"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.248328 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b49f7bf4-ce72-4b66-8d0e-b2061d228a58" (UID: "b49f7bf4-ce72-4b66-8d0e-b2061d228a58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263201 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63c09902-e057-4d3a-811f-e068f2ebe716" (UID: "63c09902-e057-4d3a-811f-e068f2ebe716"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263924 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eac88057-b7cd-4264-861c-b7d53340338d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263946 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263955 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263963 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr7vf\" (UniqueName: \"kubernetes.io/projected/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-kube-api-access-tr7vf\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263972 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263980 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/09b95848-38ec-4890-9cd2-83bc2e137c4a-kube-api-access-qx4fw\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.263988 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcfht\" (UniqueName: \"kubernetes.io/projected/eac88057-b7cd-4264-861c-b7d53340338d-kube-api-access-lcfht\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc 
kubenswrapper[4921]: I0318 12:15:36.263997 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125cafaf-afed-45eb-b6c9-0f06ee2637ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.264005 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49f7bf4-ce72-4b66-8d0e-b2061d228a58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.264014 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtq8n\" (UniqueName: \"kubernetes.io/projected/63c09902-e057-4d3a-811f-e068f2ebe716-kube-api-access-wtq8n\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.264021 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09902-e057-4d3a-811f-e068f2ebe716-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.391404 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09b95848-38ec-4890-9cd2-83bc2e137c4a" (UID: "09b95848-38ec-4890-9cd2-83bc2e137c4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.449199 4921 generic.go:334] "Generic (PLEG): container finished" podID="63c09902-e057-4d3a-811f-e068f2ebe716" containerID="e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965" exitCode=0 Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.449260 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gvwb" event={"ID":"63c09902-e057-4d3a-811f-e068f2ebe716","Type":"ContainerDied","Data":"e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.449288 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gvwb" event={"ID":"63c09902-e057-4d3a-811f-e068f2ebe716","Type":"ContainerDied","Data":"e0746dd28698e000029ab7a69d72a00590eaa3dd27cf201557599b3d2d7f368c"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.449316 4921 scope.go:117] "RemoveContainer" containerID="e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.449455 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gvwb" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.462916 4921 generic.go:334] "Generic (PLEG): container finished" podID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerID="226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310" exitCode=0 Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.462996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8k6" event={"ID":"125cafaf-afed-45eb-b6c9-0f06ee2637ec","Type":"ContainerDied","Data":"226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.463029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c8k6" event={"ID":"125cafaf-afed-45eb-b6c9-0f06ee2637ec","Type":"ContainerDied","Data":"6e25cc082e6276b765f38f441f64809af71c7140d00376690fcedc0c9f75803d"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.463126 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4c8k6" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.468125 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b95848-38ec-4890-9cd2-83bc2e137c4a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.468558 4921 generic.go:334] "Generic (PLEG): container finished" podID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerID="becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b" exitCode=0 Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.468663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf9vr" event={"ID":"09b95848-38ec-4890-9cd2-83bc2e137c4a","Type":"ContainerDied","Data":"becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.468696 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pf9vr" event={"ID":"09b95848-38ec-4890-9cd2-83bc2e137c4a","Type":"ContainerDied","Data":"889651df45aa4e220583ea0d29f61ed8f626e9a44c22323c32444ca98a71f018"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.468800 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pf9vr" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.470388 4921 generic.go:334] "Generic (PLEG): container finished" podID="eac88057-b7cd-4264-861c-b7d53340338d" containerID="99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467" exitCode=0 Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.470424 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.470464 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" event={"ID":"eac88057-b7cd-4264-861c-b7d53340338d","Type":"ContainerDied","Data":"99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.470490 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4lm" event={"ID":"eac88057-b7cd-4264-861c-b7d53340338d","Type":"ContainerDied","Data":"5475247ab74b8b063f1543e97a2653e0e958804ae4440787352871d8ca68e103"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.471928 4921 generic.go:334] "Generic (PLEG): container finished" podID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerID="ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2" exitCode=0 Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.472068 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshr9" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.472045 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerDied","Data":"ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.472977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshr9" event={"ID":"b49f7bf4-ce72-4b66-8d0e-b2061d228a58","Type":"ContainerDied","Data":"79b4011e68d50e464bb565e35ff3fd2a2f9dc9e241abb66319d3a415a0fdf409"} Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.483127 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gvwb"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.483179 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8gvwb"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.488265 4921 scope.go:117] "RemoveContainer" containerID="7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.511069 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4c8k6"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.515061 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4c8k6"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.521316 4921 scope.go:117] "RemoveContainer" containerID="7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.522189 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mbf4l"] Mar 18 12:15:36 crc kubenswrapper[4921]: W0318 
12:15:36.526609 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7908e95d_6c90_41f8_924d_072fd69e70c6.slice/crio-02c19db5a35eb73ce36e79bf85fd37ae7097ac137cdbddac904fbaf3915681fb WatchSource:0}: Error finding container 02c19db5a35eb73ce36e79bf85fd37ae7097ac137cdbddac904fbaf3915681fb: Status 404 returned error can't find the container with id 02c19db5a35eb73ce36e79bf85fd37ae7097ac137cdbddac904fbaf3915681fb Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.535245 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4lm"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.542265 4921 scope.go:117] "RemoveContainer" containerID="e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.542315 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4lm"] Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.542777 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965\": container with ID starting with e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965 not found: ID does not exist" containerID="e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.542809 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965"} err="failed to get container status \"e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965\": rpc error: code = NotFound desc = could not find container \"e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965\": container with ID starting 
with e825be27d00325edf33d283beb29861b508a4286076ce0a4fa7de356c495a965 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.542830 4921 scope.go:117] "RemoveContainer" containerID="7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.543030 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1\": container with ID starting with 7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1 not found: ID does not exist" containerID="7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.543051 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1"} err="failed to get container status \"7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1\": rpc error: code = NotFound desc = could not find container \"7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1\": container with ID starting with 7d2f017e9d30ec5640066a5f509df0362d1abcb9563c3f1f0864ee1706fb29c1 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.543066 4921 scope.go:117] "RemoveContainer" containerID="7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.543376 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055\": container with ID starting with 7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055 not found: ID does not exist" containerID="7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055" Mar 18 12:15:36 
crc kubenswrapper[4921]: I0318 12:15:36.543421 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055"} err="failed to get container status \"7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055\": rpc error: code = NotFound desc = could not find container \"7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055\": container with ID starting with 7a6034ba5e7a0865b7dd32046754ed87ac7d478d3ee98fdfaa829aea06120055 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.543444 4921 scope.go:117] "RemoveContainer" containerID="226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.546473 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshr9"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.549938 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshr9"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.568788 4921 scope.go:117] "RemoveContainer" containerID="5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.569828 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pf9vr"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.573298 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pf9vr"] Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.584166 4921 scope.go:117] "RemoveContainer" containerID="ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.614574 4921 scope.go:117] "RemoveContainer" containerID="226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310" Mar 18 12:15:36 crc 
kubenswrapper[4921]: E0318 12:15:36.615039 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310\": container with ID starting with 226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310 not found: ID does not exist" containerID="226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.615067 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310"} err="failed to get container status \"226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310\": rpc error: code = NotFound desc = could not find container \"226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310\": container with ID starting with 226275ba7af8264ebcdcfa7ac2688a3b5a8d7dad89565fce9f675daa646c0310 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.615090 4921 scope.go:117] "RemoveContainer" containerID="5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.617004 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e\": container with ID starting with 5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e not found: ID does not exist" containerID="5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.617060 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e"} err="failed to get container status 
\"5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e\": rpc error: code = NotFound desc = could not find container \"5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e\": container with ID starting with 5e6ef49b9b705784b71014706a7340efb11f17d613de33bda24825e82958d61e not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.617094 4921 scope.go:117] "RemoveContainer" containerID="ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.617835 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0\": container with ID starting with ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0 not found: ID does not exist" containerID="ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.617863 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0"} err="failed to get container status \"ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0\": rpc error: code = NotFound desc = could not find container \"ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0\": container with ID starting with ae5ee8f99ea9549a5ed9651b14297c03101d67ba0f4c9c34270f929123e046b0 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.617882 4921 scope.go:117] "RemoveContainer" containerID="becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.640609 4921 scope.go:117] "RemoveContainer" containerID="b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.657719 4921 
scope.go:117] "RemoveContainer" containerID="80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.678540 4921 scope.go:117] "RemoveContainer" containerID="becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.679026 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b\": container with ID starting with becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b not found: ID does not exist" containerID="becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.679073 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b"} err="failed to get container status \"becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b\": rpc error: code = NotFound desc = could not find container \"becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b\": container with ID starting with becaf42b6d17418fe8a1510aaafbe818bd321fd640323fd99434f3dc4ec11c1b not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.679168 4921 scope.go:117] "RemoveContainer" containerID="b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.679671 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939\": container with ID starting with b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939 not found: ID does not exist" containerID="b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939" Mar 18 
12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.679696 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939"} err="failed to get container status \"b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939\": rpc error: code = NotFound desc = could not find container \"b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939\": container with ID starting with b94cab7e2f31915b1f0cfdae10228764d564193831dca1128578010a7d271939 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.679722 4921 scope.go:117] "RemoveContainer" containerID="80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.680241 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50\": container with ID starting with 80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50 not found: ID does not exist" containerID="80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.680295 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50"} err="failed to get container status \"80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50\": rpc error: code = NotFound desc = could not find container \"80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50\": container with ID starting with 80adcd0ccc85199654db2b8bcd6d764d6d7a43b11d4db37b6aa2f13f314e2c50 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.680320 4921 scope.go:117] "RemoveContainer" 
containerID="99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.699896 4921 scope.go:117] "RemoveContainer" containerID="99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.700511 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467\": container with ID starting with 99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467 not found: ID does not exist" containerID="99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.700589 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467"} err="failed to get container status \"99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467\": rpc error: code = NotFound desc = could not find container \"99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467\": container with ID starting with 99308d2a6178adc1d75f85378d086eb40ba43f3dfbb5a5e9fbd382a6d08cc467 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.700635 4921 scope.go:117] "RemoveContainer" containerID="ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.713935 4921 scope.go:117] "RemoveContainer" containerID="78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.728414 4921 scope.go:117] "RemoveContainer" containerID="448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.773591 4921 scope.go:117] "RemoveContainer" 
containerID="ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.773979 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2\": container with ID starting with ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2 not found: ID does not exist" containerID="ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.774022 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2"} err="failed to get container status \"ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2\": rpc error: code = NotFound desc = could not find container \"ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2\": container with ID starting with ded98d11141ed845891b8ebaf37ad9a73d731132763d142a75d4de485997f1d2 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.774059 4921 scope.go:117] "RemoveContainer" containerID="78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.774431 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2\": container with ID starting with 78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2 not found: ID does not exist" containerID="78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.774461 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2"} err="failed to get container status \"78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2\": rpc error: code = NotFound desc = could not find container \"78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2\": container with ID starting with 78d163d9dd4d25d0be8f28d74582583b583a722bdfb7d8f20cf15cd4de2505f2 not found: ID does not exist" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.774484 4921 scope.go:117] "RemoveContainer" containerID="448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297" Mar 18 12:15:36 crc kubenswrapper[4921]: E0318 12:15:36.774766 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297\": container with ID starting with 448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297 not found: ID does not exist" containerID="448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297" Mar 18 12:15:36 crc kubenswrapper[4921]: I0318 12:15:36.774787 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297"} err="failed to get container status \"448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297\": rpc error: code = NotFound desc = could not find container \"448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297\": container with ID starting with 448512dc7701b7415db493e30fbf1d7ba6337f557ee82adf74035132b2bab297 not found: ID does not exist" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.226318 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" path="/var/lib/kubelet/pods/09b95848-38ec-4890-9cd2-83bc2e137c4a/volumes" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 
12:15:37.227370 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" path="/var/lib/kubelet/pods/125cafaf-afed-45eb-b6c9-0f06ee2637ec/volumes" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.228301 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" path="/var/lib/kubelet/pods/63c09902-e057-4d3a-811f-e068f2ebe716/volumes" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.229569 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" path="/var/lib/kubelet/pods/b49f7bf4-ce72-4b66-8d0e-b2061d228a58/volumes" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.230344 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac88057-b7cd-4264-861c-b7d53340338d" path="/var/lib/kubelet/pods/eac88057-b7cd-4264-861c-b7d53340338d/volumes" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383327 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dk9qh"] Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383567 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383582 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383594 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383603 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383615 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383623 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383634 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383642 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383652 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383659 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383673 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac88057-b7cd-4264-861c-b7d53340338d" containerName="marketplace-operator" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383682 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac88057-b7cd-4264-861c-b7d53340338d" containerName="marketplace-operator" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383696 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383704 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383714 
4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383722 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383730 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383738 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383749 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383756 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="extract-utilities" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383768 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383777 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383787 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383795 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: E0318 12:15:37.383804 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383811 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="extract-content" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383925 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c09902-e057-4d3a-811f-e068f2ebe716" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383940 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49f7bf4-ce72-4b66-8d0e-b2061d228a58" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383956 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b95848-38ec-4890-9cd2-83bc2e137c4a" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383968 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="125cafaf-afed-45eb-b6c9-0f06ee2637ec" containerName="registry-server" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.383979 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac88057-b7cd-4264-861c-b7d53340338d" containerName="marketplace-operator" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.388123 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.395432 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.414087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dk9qh"] Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.477866 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" event={"ID":"7908e95d-6c90-41f8-924d-072fd69e70c6","Type":"ContainerStarted","Data":"8a92774c45e73897be68c20274bced4c94a874a0c53bd23395f7e9109bb13adf"} Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.477904 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" event={"ID":"7908e95d-6c90-41f8-924d-072fd69e70c6","Type":"ContainerStarted","Data":"02c19db5a35eb73ce36e79bf85fd37ae7097ac137cdbddac904fbaf3915681fb"} Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.478860 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.479748 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-catalog-content\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.479786 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-utilities\") pod 
\"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.479840 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxl9m\" (UniqueName: \"kubernetes.io/projected/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-kube-api-access-dxl9m\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.484514 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.499202 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mbf4l" podStartSLOduration=2.499187951 podStartE2EDuration="2.499187951s" podCreationTimestamp="2026-03-18 12:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:37.496140034 +0000 UTC m=+357.046060683" watchObservedRunningTime="2026-03-18 12:15:37.499187951 +0000 UTC m=+357.049108590" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.581266 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-catalog-content\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.581352 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-utilities\") pod 
\"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.581394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxl9m\" (UniqueName: \"kubernetes.io/projected/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-kube-api-access-dxl9m\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.581766 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-catalog-content\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.581977 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-utilities\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.602943 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxl9m\" (UniqueName: \"kubernetes.io/projected/1dfe7c23-1893-4da0-a65b-dfb1c0da89e8-kube-api-access-dxl9m\") pod \"certified-operators-dk9qh\" (UID: \"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8\") " pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:37 crc kubenswrapper[4921]: I0318 12:15:37.713479 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.154587 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dk9qh"] Mar 18 12:15:38 crc kubenswrapper[4921]: W0318 12:15:38.164656 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dfe7c23_1893_4da0_a65b_dfb1c0da89e8.slice/crio-00ff5e6ba5b1df0dbcd8fc4ff8a38b70ca9b5ae909708e29ef4c1708b9dbc007 WatchSource:0}: Error finding container 00ff5e6ba5b1df0dbcd8fc4ff8a38b70ca9b5ae909708e29ef4c1708b9dbc007: Status 404 returned error can't find the container with id 00ff5e6ba5b1df0dbcd8fc4ff8a38b70ca9b5ae909708e29ef4c1708b9dbc007 Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.382378 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p4d9q"] Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.383709 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.386041 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.388452 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4d9q"] Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.392226 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edcf939-f4bf-4aba-84fe-8e44a12fac21-utilities\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.392283 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edcf939-f4bf-4aba-84fe-8e44a12fac21-catalog-content\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.392313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6rzm\" (UniqueName: \"kubernetes.io/projected/0edcf939-f4bf-4aba-84fe-8e44a12fac21-kube-api-access-v6rzm\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.492864 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edcf939-f4bf-4aba-84fe-8e44a12fac21-utilities\") pod \"redhat-marketplace-p4d9q\" (UID: 
\"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.492905 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edcf939-f4bf-4aba-84fe-8e44a12fac21-catalog-content\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.492921 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6rzm\" (UniqueName: \"kubernetes.io/projected/0edcf939-f4bf-4aba-84fe-8e44a12fac21-kube-api-access-v6rzm\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.493379 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edcf939-f4bf-4aba-84fe-8e44a12fac21-catalog-content\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.493510 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edcf939-f4bf-4aba-84fe-8e44a12fac21-utilities\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.506241 4921 generic.go:334] "Generic (PLEG): container finished" podID="1dfe7c23-1893-4da0-a65b-dfb1c0da89e8" containerID="c4f5149996242bde8cf3f5749a52c72e592a549ea4ab9fa16f994cf8d35c892b" exitCode=0 Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.506386 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk9qh" event={"ID":"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8","Type":"ContainerDied","Data":"c4f5149996242bde8cf3f5749a52c72e592a549ea4ab9fa16f994cf8d35c892b"} Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.506436 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk9qh" event={"ID":"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8","Type":"ContainerStarted","Data":"00ff5e6ba5b1df0dbcd8fc4ff8a38b70ca9b5ae909708e29ef4c1708b9dbc007"} Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.517962 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6rzm\" (UniqueName: \"kubernetes.io/projected/0edcf939-f4bf-4aba-84fe-8e44a12fac21-kube-api-access-v6rzm\") pod \"redhat-marketplace-p4d9q\" (UID: \"0edcf939-f4bf-4aba-84fe-8e44a12fac21\") " pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.718132 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:38 crc kubenswrapper[4921]: I0318 12:15:38.896148 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4d9q"] Mar 18 12:15:38 crc kubenswrapper[4921]: W0318 12:15:38.898590 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0edcf939_f4bf_4aba_84fe_8e44a12fac21.slice/crio-c6ba867ed2acb0fb437503440679a2bedad3974011fc54b27ea96f598e46f56a WatchSource:0}: Error finding container c6ba867ed2acb0fb437503440679a2bedad3974011fc54b27ea96f598e46f56a: Status 404 returned error can't find the container with id c6ba867ed2acb0fb437503440679a2bedad3974011fc54b27ea96f598e46f56a Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.516216 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk9qh" event={"ID":"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8","Type":"ContainerStarted","Data":"bb8ab69b264dfb29bf5681750d241d2fcf7947275c9680ff00c51f5976bc6249"} Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.519728 4921 generic.go:334] "Generic (PLEG): container finished" podID="0edcf939-f4bf-4aba-84fe-8e44a12fac21" containerID="8d8827ac5417e35422e3a1d7b0ab1c3e5f5c1cf4f6920317eed05bcf0192c1b8" exitCode=0 Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.520803 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4d9q" event={"ID":"0edcf939-f4bf-4aba-84fe-8e44a12fac21","Type":"ContainerDied","Data":"8d8827ac5417e35422e3a1d7b0ab1c3e5f5c1cf4f6920317eed05bcf0192c1b8"} Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.520844 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4d9q" 
event={"ID":"0edcf939-f4bf-4aba-84fe-8e44a12fac21","Type":"ContainerStarted","Data":"c6ba867ed2acb0fb437503440679a2bedad3974011fc54b27ea96f598e46f56a"} Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.778881 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4g8n"] Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.780133 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.784430 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.797135 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4g8n"] Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.810767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-utilities\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.812272 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rmnb\" (UniqueName: \"kubernetes.io/projected/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-kube-api-access-4rmnb\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.813057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-catalog-content\") pod \"redhat-operators-k4g8n\" 
(UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.914601 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-utilities\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.914898 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rmnb\" (UniqueName: \"kubernetes.io/projected/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-kube-api-access-4rmnb\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.915245 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-utilities\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.915433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-catalog-content\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.915710 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-catalog-content\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " 
pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:39 crc kubenswrapper[4921]: I0318 12:15:39.934280 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rmnb\" (UniqueName: \"kubernetes.io/projected/c452e3b2-05ab-4e09-a6a9-e59aee2e30cd-kube-api-access-4rmnb\") pod \"redhat-operators-k4g8n\" (UID: \"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd\") " pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.096774 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.296575 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4g8n"] Mar 18 12:15:40 crc kubenswrapper[4921]: W0318 12:15:40.308240 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc452e3b2_05ab_4e09_a6a9_e59aee2e30cd.slice/crio-36fd703701292d4be51e88ae794eb706177b6cb2b06d471e97675ee79e7cceaa WatchSource:0}: Error finding container 36fd703701292d4be51e88ae794eb706177b6cb2b06d471e97675ee79e7cceaa: Status 404 returned error can't find the container with id 36fd703701292d4be51e88ae794eb706177b6cb2b06d471e97675ee79e7cceaa Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.528790 4921 generic.go:334] "Generic (PLEG): container finished" podID="1dfe7c23-1893-4da0-a65b-dfb1c0da89e8" containerID="bb8ab69b264dfb29bf5681750d241d2fcf7947275c9680ff00c51f5976bc6249" exitCode=0 Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.528882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk9qh" event={"ID":"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8","Type":"ContainerDied","Data":"bb8ab69b264dfb29bf5681750d241d2fcf7947275c9680ff00c51f5976bc6249"} Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.530871 4921 
generic.go:334] "Generic (PLEG): container finished" podID="c452e3b2-05ab-4e09-a6a9-e59aee2e30cd" containerID="08db5143d67e02cee612426357647430826760c91efad46adb303db904fa7f0d" exitCode=0 Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.530964 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4g8n" event={"ID":"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd","Type":"ContainerDied","Data":"08db5143d67e02cee612426357647430826760c91efad46adb303db904fa7f0d"} Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.531001 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4g8n" event={"ID":"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd","Type":"ContainerStarted","Data":"36fd703701292d4be51e88ae794eb706177b6cb2b06d471e97675ee79e7cceaa"} Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.534785 4921 generic.go:334] "Generic (PLEG): container finished" podID="0edcf939-f4bf-4aba-84fe-8e44a12fac21" containerID="4b48b21ea77f420ea73634565a6348d0e620562bfe0ae3159f5413ad3b72d750" exitCode=0 Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.534842 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4d9q" event={"ID":"0edcf939-f4bf-4aba-84fe-8e44a12fac21","Type":"ContainerDied","Data":"4b48b21ea77f420ea73634565a6348d0e620562bfe0ae3159f5413ad3b72d750"} Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.778077 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rm29k"] Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.779274 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.781345 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.804244 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rm29k"] Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.827528 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56b6108-5b3e-40ac-b8f6-f977464feb3b-catalog-content\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.827599 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkt4p\" (UniqueName: \"kubernetes.io/projected/f56b6108-5b3e-40ac-b8f6-f977464feb3b-kube-api-access-tkt4p\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.827655 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56b6108-5b3e-40ac-b8f6-f977464feb3b-utilities\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.928562 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56b6108-5b3e-40ac-b8f6-f977464feb3b-utilities\") pod \"community-operators-rm29k\" (UID: 
\"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.928640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56b6108-5b3e-40ac-b8f6-f977464feb3b-catalog-content\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.928674 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkt4p\" (UniqueName: \"kubernetes.io/projected/f56b6108-5b3e-40ac-b8f6-f977464feb3b-kube-api-access-tkt4p\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.929251 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56b6108-5b3e-40ac-b8f6-f977464feb3b-utilities\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.929273 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56b6108-5b3e-40ac-b8f6-f977464feb3b-catalog-content\") pod \"community-operators-rm29k\" (UID: \"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:40 crc kubenswrapper[4921]: I0318 12:15:40.948438 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkt4p\" (UniqueName: \"kubernetes.io/projected/f56b6108-5b3e-40ac-b8f6-f977464feb3b-kube-api-access-tkt4p\") pod \"community-operators-rm29k\" (UID: 
\"f56b6108-5b3e-40ac-b8f6-f977464feb3b\") " pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.096592 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.362753 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rm29k"] Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.542737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk9qh" event={"ID":"1dfe7c23-1893-4da0-a65b-dfb1c0da89e8","Type":"ContainerStarted","Data":"a6e12b9aab09302c5ceb481b257d7a4bc885769993669d1cb50555b462f5d393"} Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.544827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4g8n" event={"ID":"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd","Type":"ContainerStarted","Data":"aab7ca9de9455fe1eb61e67aec96adb629fb3e05de2d402b12db50412434e86f"} Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.547659 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4d9q" event={"ID":"0edcf939-f4bf-4aba-84fe-8e44a12fac21","Type":"ContainerStarted","Data":"e659591d7573e3d9ddbcf8d2f4d23d7544a7461d7f6329f7da434e4b20ab689f"} Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.549656 4921 generic.go:334] "Generic (PLEG): container finished" podID="f56b6108-5b3e-40ac-b8f6-f977464feb3b" containerID="db1830444d10dfe16f365fea276c515ed69150eca6c78135b26242c4554049d8" exitCode=0 Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.549696 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm29k" 
event={"ID":"f56b6108-5b3e-40ac-b8f6-f977464feb3b","Type":"ContainerDied","Data":"db1830444d10dfe16f365fea276c515ed69150eca6c78135b26242c4554049d8"} Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.549712 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm29k" event={"ID":"f56b6108-5b3e-40ac-b8f6-f977464feb3b","Type":"ContainerStarted","Data":"902a71fd28f6adf83f0216829500c22d06b42d7a633271f9b8b2a0bafc7b7620"} Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.566941 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dk9qh" podStartSLOduration=2.068712979 podStartE2EDuration="4.566915716s" podCreationTimestamp="2026-03-18 12:15:37 +0000 UTC" firstStartedPulling="2026-03-18 12:15:38.508641856 +0000 UTC m=+358.058562535" lastFinishedPulling="2026-03-18 12:15:41.006844623 +0000 UTC m=+360.556765272" observedRunningTime="2026-03-18 12:15:41.566419471 +0000 UTC m=+361.116340110" watchObservedRunningTime="2026-03-18 12:15:41.566915716 +0000 UTC m=+361.116836355" Mar 18 12:15:41 crc kubenswrapper[4921]: I0318 12:15:41.588294 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p4d9q" podStartSLOduration=2.095562602 podStartE2EDuration="3.588275398s" podCreationTimestamp="2026-03-18 12:15:38 +0000 UTC" firstStartedPulling="2026-03-18 12:15:39.523257458 +0000 UTC m=+359.073178097" lastFinishedPulling="2026-03-18 12:15:41.015970254 +0000 UTC m=+360.565890893" observedRunningTime="2026-03-18 12:15:41.587539147 +0000 UTC m=+361.137459786" watchObservedRunningTime="2026-03-18 12:15:41.588275398 +0000 UTC m=+361.138196037" Mar 18 12:15:42 crc kubenswrapper[4921]: I0318 12:15:42.556684 4921 generic.go:334] "Generic (PLEG): container finished" podID="c452e3b2-05ab-4e09-a6a9-e59aee2e30cd" containerID="aab7ca9de9455fe1eb61e67aec96adb629fb3e05de2d402b12db50412434e86f" exitCode=0 Mar 18 
12:15:42 crc kubenswrapper[4921]: I0318 12:15:42.556751 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4g8n" event={"ID":"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd","Type":"ContainerDied","Data":"aab7ca9de9455fe1eb61e67aec96adb629fb3e05de2d402b12db50412434e86f"} Mar 18 12:15:43 crc kubenswrapper[4921]: I0318 12:15:43.566394 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4g8n" event={"ID":"c452e3b2-05ab-4e09-a6a9-e59aee2e30cd","Type":"ContainerStarted","Data":"ecd13c90809d914311a52524b56c1f706d86f51fcfb087646bfb579fde58bfd6"} Mar 18 12:15:43 crc kubenswrapper[4921]: I0318 12:15:43.568634 4921 generic.go:334] "Generic (PLEG): container finished" podID="f56b6108-5b3e-40ac-b8f6-f977464feb3b" containerID="a8a87e09fe8f868ff37531d055e751c4daedc02a498b3cafa8746acd1382ad1a" exitCode=0 Mar 18 12:15:43 crc kubenswrapper[4921]: I0318 12:15:43.568684 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rm29k" event={"ID":"f56b6108-5b3e-40ac-b8f6-f977464feb3b","Type":"ContainerDied","Data":"a8a87e09fe8f868ff37531d055e751c4daedc02a498b3cafa8746acd1382ad1a"} Mar 18 12:15:43 crc kubenswrapper[4921]: I0318 12:15:43.584157 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4g8n" podStartSLOduration=2.080151155 podStartE2EDuration="4.584135417s" podCreationTimestamp="2026-03-18 12:15:39 +0000 UTC" firstStartedPulling="2026-03-18 12:15:40.533619219 +0000 UTC m=+360.083539868" lastFinishedPulling="2026-03-18 12:15:43.037603491 +0000 UTC m=+362.587524130" observedRunningTime="2026-03-18 12:15:43.583760376 +0000 UTC m=+363.133681035" watchObservedRunningTime="2026-03-18 12:15:43.584135417 +0000 UTC m=+363.134056076" Mar 18 12:15:44 crc kubenswrapper[4921]: I0318 12:15:44.579683 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rm29k" event={"ID":"f56b6108-5b3e-40ac-b8f6-f977464feb3b","Type":"ContainerStarted","Data":"ceffadc58176b7e38538b5d6899d9036d6f605c2292feb07f4cce6647beaad43"} Mar 18 12:15:44 crc kubenswrapper[4921]: I0318 12:15:44.600028 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rm29k" podStartSLOduration=2.165759049 podStartE2EDuration="4.600013994s" podCreationTimestamp="2026-03-18 12:15:40 +0000 UTC" firstStartedPulling="2026-03-18 12:15:41.55066284 +0000 UTC m=+361.100583469" lastFinishedPulling="2026-03-18 12:15:43.984917775 +0000 UTC m=+363.534838414" observedRunningTime="2026-03-18 12:15:44.598078089 +0000 UTC m=+364.147998728" watchObservedRunningTime="2026-03-18 12:15:44.600013994 +0000 UTC m=+364.149934633" Mar 18 12:15:47 crc kubenswrapper[4921]: I0318 12:15:47.714357 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:47 crc kubenswrapper[4921]: I0318 12:15:47.715472 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:47 crc kubenswrapper[4921]: I0318 12:15:47.761168 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:48 crc kubenswrapper[4921]: I0318 12:15:48.634020 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dk9qh" Mar 18 12:15:48 crc kubenswrapper[4921]: I0318 12:15:48.718703 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:48 crc kubenswrapper[4921]: I0318 12:15:48.718812 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:48 crc 
kubenswrapper[4921]: I0318 12:15:48.761446 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:49 crc kubenswrapper[4921]: I0318 12:15:49.652222 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p4d9q" Mar 18 12:15:50 crc kubenswrapper[4921]: I0318 12:15:50.097250 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:50 crc kubenswrapper[4921]: I0318 12:15:50.098076 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:50 crc kubenswrapper[4921]: I0318 12:15:50.164081 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:50 crc kubenswrapper[4921]: I0318 12:15:50.662802 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4g8n" Mar 18 12:15:51 crc kubenswrapper[4921]: I0318 12:15:51.097570 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:51 crc kubenswrapper[4921]: I0318 12:15:51.097632 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:51 crc kubenswrapper[4921]: I0318 12:15:51.168295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:15:51 crc kubenswrapper[4921]: I0318 12:15:51.649829 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rm29k" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.135535 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563936-qlwwl"] Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.137143 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.139060 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.139893 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.142367 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.143515 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-qlwwl"] Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.294214 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws776\" (UniqueName: \"kubernetes.io/projected/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b-kube-api-access-ws776\") pod \"auto-csr-approver-29563936-qlwwl\" (UID: \"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b\") " pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.395830 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws776\" (UniqueName: \"kubernetes.io/projected/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b-kube-api-access-ws776\") pod \"auto-csr-approver-29563936-qlwwl\" (UID: \"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b\") " pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.416405 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws776\" (UniqueName: 
\"kubernetes.io/projected/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b-kube-api-access-ws776\") pod \"auto-csr-approver-29563936-qlwwl\" (UID: \"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b\") " pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.463254 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:00 crc kubenswrapper[4921]: I0318 12:16:00.866695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-qlwwl"] Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.257189 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" podUID="dbee3eb4-1971-45b8-b0a5-3819407584ec" containerName="registry" containerID="cri-o://57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade" gracePeriod=30 Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.630606 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.680150 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" event={"ID":"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b","Type":"ContainerStarted","Data":"2a9820c6608013a450dbc9b03b9cecbfa509e3dff6c2c600fd924a68720f685d"} Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.681825 4921 generic.go:334] "Generic (PLEG): container finished" podID="dbee3eb4-1971-45b8-b0a5-3819407584ec" containerID="57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade" exitCode=0 Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.681866 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.681879 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" event={"ID":"dbee3eb4-1971-45b8-b0a5-3819407584ec","Type":"ContainerDied","Data":"57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade"} Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.681921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x4bzs" event={"ID":"dbee3eb4-1971-45b8-b0a5-3819407584ec","Type":"ContainerDied","Data":"617785ee0960af458b79d3685989a134e54459d2cb2478e920c396d073756b77"} Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.681943 4921 scope.go:117] "RemoveContainer" containerID="57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.701325 4921 scope.go:117] "RemoveContainer" containerID="57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade" Mar 18 12:16:01 crc kubenswrapper[4921]: E0318 12:16:01.701791 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade\": container with ID starting with 57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade not found: ID does not exist" containerID="57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.701822 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade"} err="failed to get container status \"57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade\": rpc error: code = NotFound desc = could not find container 
\"57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade\": container with ID starting with 57ab0b12a991e32507beeb940f7f26ab3a156bd28587916d5416b404043b5ade not found: ID does not exist" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815393 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cml8b\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-kube-api-access-cml8b\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815492 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-bound-sa-token\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815536 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-trusted-ca\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815575 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbee3eb4-1971-45b8-b0a5-3819407584ec-ca-trust-extracted\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815603 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-certificates\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 
18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815802 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815848 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbee3eb4-1971-45b8-b0a5-3819407584ec-installation-pull-secrets\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.815881 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-tls\") pod \"dbee3eb4-1971-45b8-b0a5-3819407584ec\" (UID: \"dbee3eb4-1971-45b8-b0a5-3819407584ec\") " Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.817891 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.817913 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.822513 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.823049 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.824475 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbee3eb4-1971-45b8-b0a5-3819407584ec-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.825160 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-kube-api-access-cml8b" (OuterVolumeSpecName: "kube-api-access-cml8b") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "kube-api-access-cml8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.834278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.836405 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee3eb4-1971-45b8-b0a5-3819407584ec-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "dbee3eb4-1971-45b8-b0a5-3819407584ec" (UID: "dbee3eb4-1971-45b8-b0a5-3819407584ec"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917281 4921 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbee3eb4-1971-45b8-b0a5-3819407584ec-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917321 4921 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917360 4921 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbee3eb4-1971-45b8-b0a5-3819407584ec-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917377 4921 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917389 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cml8b\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-kube-api-access-cml8b\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917402 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbee3eb4-1971-45b8-b0a5-3819407584ec-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:01 crc kubenswrapper[4921]: I0318 12:16:01.917436 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbee3eb4-1971-45b8-b0a5-3819407584ec-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:02 crc kubenswrapper[4921]: I0318 12:16:02.026349 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x4bzs"] Mar 18 12:16:02 crc kubenswrapper[4921]: I0318 12:16:02.035256 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x4bzs"] Mar 18 12:16:02 crc kubenswrapper[4921]: I0318 12:16:02.688484 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" event={"ID":"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b","Type":"ContainerStarted","Data":"c550627b537072e64aefcb8a2a6b0a286d8ee50a07c31c1b370fbef03062aadd"} Mar 18 12:16:02 crc kubenswrapper[4921]: I0318 12:16:02.703080 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" podStartSLOduration=1.601931217 podStartE2EDuration="2.703062939s" podCreationTimestamp="2026-03-18 12:16:00 +0000 UTC" firstStartedPulling="2026-03-18 12:16:00.874395844 +0000 UTC 
m=+380.424316483" lastFinishedPulling="2026-03-18 12:16:01.975527566 +0000 UTC m=+381.525448205" observedRunningTime="2026-03-18 12:16:02.701662349 +0000 UTC m=+382.251582988" watchObservedRunningTime="2026-03-18 12:16:02.703062939 +0000 UTC m=+382.252983578" Mar 18 12:16:03 crc kubenswrapper[4921]: I0318 12:16:03.217755 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbee3eb4-1971-45b8-b0a5-3819407584ec" path="/var/lib/kubelet/pods/dbee3eb4-1971-45b8-b0a5-3819407584ec/volumes" Mar 18 12:16:03 crc kubenswrapper[4921]: I0318 12:16:03.702377 4921 generic.go:334] "Generic (PLEG): container finished" podID="85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b" containerID="c550627b537072e64aefcb8a2a6b0a286d8ee50a07c31c1b370fbef03062aadd" exitCode=0 Mar 18 12:16:03 crc kubenswrapper[4921]: I0318 12:16:03.702421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" event={"ID":"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b","Type":"ContainerDied","Data":"c550627b537072e64aefcb8a2a6b0a286d8ee50a07c31c1b370fbef03062aadd"} Mar 18 12:16:04 crc kubenswrapper[4921]: I0318 12:16:04.933700 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:04 crc kubenswrapper[4921]: I0318 12:16:04.988782 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws776\" (UniqueName: \"kubernetes.io/projected/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b-kube-api-access-ws776\") pod \"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b\" (UID: \"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b\") " Mar 18 12:16:04 crc kubenswrapper[4921]: I0318 12:16:04.994493 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b-kube-api-access-ws776" (OuterVolumeSpecName: "kube-api-access-ws776") pod "85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b" (UID: "85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b"). InnerVolumeSpecName "kube-api-access-ws776". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:16:05 crc kubenswrapper[4921]: I0318 12:16:05.090271 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws776\" (UniqueName: \"kubernetes.io/projected/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b-kube-api-access-ws776\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:05 crc kubenswrapper[4921]: I0318 12:16:05.717131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" event={"ID":"85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b","Type":"ContainerDied","Data":"2a9820c6608013a450dbc9b03b9cecbfa509e3dff6c2c600fd924a68720f685d"} Mar 18 12:16:05 crc kubenswrapper[4921]: I0318 12:16:05.717536 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9820c6608013a450dbc9b03b9cecbfa509e3dff6c2c600fd924a68720f685d" Mar 18 12:16:05 crc kubenswrapper[4921]: I0318 12:16:05.717174 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-qlwwl" Mar 18 12:16:47 crc kubenswrapper[4921]: I0318 12:16:47.081577 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:16:47 crc kubenswrapper[4921]: I0318 12:16:47.082365 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:17 crc kubenswrapper[4921]: I0318 12:17:17.081398 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:17:17 crc kubenswrapper[4921]: I0318 12:17:17.081975 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.081320 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.081784 4921 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.081829 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.082394 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90c2627d76f56111b8091bf7cf164f6179d0606e616e7e916cb477a7a9cb4d04"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.082444 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://90c2627d76f56111b8091bf7cf164f6179d0606e616e7e916cb477a7a9cb4d04" gracePeriod=600 Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.435940 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="90c2627d76f56111b8091bf7cf164f6179d0606e616e7e916cb477a7a9cb4d04" exitCode=0 Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.436019 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"90c2627d76f56111b8091bf7cf164f6179d0606e616e7e916cb477a7a9cb4d04"} Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 
12:17:47.436055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"fc5074ee409b0ac06bf766e43a02ba25eb50275e9168fd38e188e49a4ed789f7"} Mar 18 12:17:47 crc kubenswrapper[4921]: I0318 12:17:47.436077 4921 scope.go:117] "RemoveContainer" containerID="6ab8805d660292aca69d1573e05604dab7bfe6509e503cc63788830232c78aa9" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.133854 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563938-4vb5p"] Mar 18 12:18:00 crc kubenswrapper[4921]: E0318 12:18:00.135661 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee3eb4-1971-45b8-b0a5-3819407584ec" containerName="registry" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.135771 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee3eb4-1971-45b8-b0a5-3819407584ec" containerName="registry" Mar 18 12:18:00 crc kubenswrapper[4921]: E0318 12:18:00.135863 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b" containerName="oc" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.135949 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b" containerName="oc" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.136953 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b" containerName="oc" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.137079 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbee3eb4-1971-45b8-b0a5-3819407584ec" containerName="registry" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.137630 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.140345 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.141242 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.142330 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.149176 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-4vb5p"] Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.298057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngbxl\" (UniqueName: \"kubernetes.io/projected/9a16bce2-a835-450d-9101-c9cbd238b42b-kube-api-access-ngbxl\") pod \"auto-csr-approver-29563938-4vb5p\" (UID: \"9a16bce2-a835-450d-9101-c9cbd238b42b\") " pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.398954 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngbxl\" (UniqueName: \"kubernetes.io/projected/9a16bce2-a835-450d-9101-c9cbd238b42b-kube-api-access-ngbxl\") pod \"auto-csr-approver-29563938-4vb5p\" (UID: \"9a16bce2-a835-450d-9101-c9cbd238b42b\") " pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.418841 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngbxl\" (UniqueName: \"kubernetes.io/projected/9a16bce2-a835-450d-9101-c9cbd238b42b-kube-api-access-ngbxl\") pod \"auto-csr-approver-29563938-4vb5p\" (UID: \"9a16bce2-a835-450d-9101-c9cbd238b42b\") " 
pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.452833 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.858637 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-4vb5p"] Mar 18 12:18:00 crc kubenswrapper[4921]: I0318 12:18:00.871269 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:18:01 crc kubenswrapper[4921]: I0318 12:18:01.522148 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" event={"ID":"9a16bce2-a835-450d-9101-c9cbd238b42b","Type":"ContainerStarted","Data":"de7dc25b5b849ef1eff12c008ac5a88490086805aa2e7ffba34549e2cddcf7a2"} Mar 18 12:18:03 crc kubenswrapper[4921]: I0318 12:18:03.533148 4921 generic.go:334] "Generic (PLEG): container finished" podID="9a16bce2-a835-450d-9101-c9cbd238b42b" containerID="c418b070ea1c94a129f2580a7157f13eb622ed973a4c14b4e4f56dffe0ae31f9" exitCode=0 Mar 18 12:18:03 crc kubenswrapper[4921]: I0318 12:18:03.533232 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" event={"ID":"9a16bce2-a835-450d-9101-c9cbd238b42b","Type":"ContainerDied","Data":"c418b070ea1c94a129f2580a7157f13eb622ed973a4c14b4e4f56dffe0ae31f9"} Mar 18 12:18:04 crc kubenswrapper[4921]: I0318 12:18:04.792764 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:04 crc kubenswrapper[4921]: I0318 12:18:04.953059 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngbxl\" (UniqueName: \"kubernetes.io/projected/9a16bce2-a835-450d-9101-c9cbd238b42b-kube-api-access-ngbxl\") pod \"9a16bce2-a835-450d-9101-c9cbd238b42b\" (UID: \"9a16bce2-a835-450d-9101-c9cbd238b42b\") " Mar 18 12:18:04 crc kubenswrapper[4921]: I0318 12:18:04.959616 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a16bce2-a835-450d-9101-c9cbd238b42b-kube-api-access-ngbxl" (OuterVolumeSpecName: "kube-api-access-ngbxl") pod "9a16bce2-a835-450d-9101-c9cbd238b42b" (UID: "9a16bce2-a835-450d-9101-c9cbd238b42b"). InnerVolumeSpecName "kube-api-access-ngbxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:18:05 crc kubenswrapper[4921]: I0318 12:18:05.054615 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngbxl\" (UniqueName: \"kubernetes.io/projected/9a16bce2-a835-450d-9101-c9cbd238b42b-kube-api-access-ngbxl\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:05 crc kubenswrapper[4921]: I0318 12:18:05.547366 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" event={"ID":"9a16bce2-a835-450d-9101-c9cbd238b42b","Type":"ContainerDied","Data":"de7dc25b5b849ef1eff12c008ac5a88490086805aa2e7ffba34549e2cddcf7a2"} Mar 18 12:18:05 crc kubenswrapper[4921]: I0318 12:18:05.547411 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de7dc25b5b849ef1eff12c008ac5a88490086805aa2e7ffba34549e2cddcf7a2" Mar 18 12:18:05 crc kubenswrapper[4921]: I0318 12:18:05.547468 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-4vb5p" Mar 18 12:18:05 crc kubenswrapper[4921]: I0318 12:18:05.850912 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-b8gp7"] Mar 18 12:18:05 crc kubenswrapper[4921]: I0318 12:18:05.855262 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-b8gp7"] Mar 18 12:18:07 crc kubenswrapper[4921]: I0318 12:18:07.220992 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d7472e-2ed9-434f-b1f0-6147f9452a11" path="/var/lib/kubelet/pods/58d7472e-2ed9-434f-b1f0-6147f9452a11/volumes" Mar 18 12:18:47 crc kubenswrapper[4921]: I0318 12:18:47.958566 4921 scope.go:117] "RemoveContainer" containerID="7207a9b2365c5b33874b7438b0e9cd0b03112141d57ccaf049c3c37db9e7a7d5" Mar 18 12:19:47 crc kubenswrapper[4921]: I0318 12:19:47.081476 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:19:47 crc kubenswrapper[4921]: I0318 12:19:47.082029 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:19:47 crc kubenswrapper[4921]: I0318 12:19:47.988777 4921 scope.go:117] "RemoveContainer" containerID="46fe56158e9f7b1129d5ba778b7a121499e643f269f2251f712fce2c80db70f7" Mar 18 12:19:48 crc kubenswrapper[4921]: I0318 12:19:48.025336 4921 scope.go:117] "RemoveContainer" containerID="a808a8be1a2d62fbe6700eaf49d5a69bec8c366233449a880f421b202dcf95d9" Mar 18 12:19:48 crc 
kubenswrapper[4921]: I0318 12:19:48.039853 4921 scope.go:117] "RemoveContainer" containerID="6aad3434490562b863b21fa2dafd86ca52f72e6c83c829ab4e9e0ff49ff875a0" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.149905 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563940-l9d7b"] Mar 18 12:20:00 crc kubenswrapper[4921]: E0318 12:20:00.151007 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a16bce2-a835-450d-9101-c9cbd238b42b" containerName="oc" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.151024 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a16bce2-a835-450d-9101-c9cbd238b42b" containerName="oc" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.151195 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a16bce2-a835-450d-9101-c9cbd238b42b" containerName="oc" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.151655 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.154437 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.154823 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.154834 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.170104 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-l9d7b"] Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.330457 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzx4j\" (UniqueName: \"kubernetes.io/projected/9191a818-dc78-4b2f-801b-d965eeae5c8b-kube-api-access-fzx4j\") pod \"auto-csr-approver-29563940-l9d7b\" (UID: \"9191a818-dc78-4b2f-801b-d965eeae5c8b\") " pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.431602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzx4j\" (UniqueName: \"kubernetes.io/projected/9191a818-dc78-4b2f-801b-d965eeae5c8b-kube-api-access-fzx4j\") pod \"auto-csr-approver-29563940-l9d7b\" (UID: \"9191a818-dc78-4b2f-801b-d965eeae5c8b\") " pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.451450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzx4j\" (UniqueName: \"kubernetes.io/projected/9191a818-dc78-4b2f-801b-d965eeae5c8b-kube-api-access-fzx4j\") pod \"auto-csr-approver-29563940-l9d7b\" (UID: \"9191a818-dc78-4b2f-801b-d965eeae5c8b\") " 
pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.474268 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:00 crc kubenswrapper[4921]: I0318 12:20:00.674250 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-l9d7b"] Mar 18 12:20:01 crc kubenswrapper[4921]: I0318 12:20:01.230243 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" event={"ID":"9191a818-dc78-4b2f-801b-d965eeae5c8b","Type":"ContainerStarted","Data":"7edbbf1e74ec1d18e81ba9f07ea398382450b96a62a8773e3e92c19c22ab0838"} Mar 18 12:20:03 crc kubenswrapper[4921]: I0318 12:20:03.243824 4921 generic.go:334] "Generic (PLEG): container finished" podID="9191a818-dc78-4b2f-801b-d965eeae5c8b" containerID="00488fa9a37401bc2a8a43ebf8258eb57a6a8ff740d85bf5dffb3349df358a78" exitCode=0 Mar 18 12:20:03 crc kubenswrapper[4921]: I0318 12:20:03.243989 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" event={"ID":"9191a818-dc78-4b2f-801b-d965eeae5c8b","Type":"ContainerDied","Data":"00488fa9a37401bc2a8a43ebf8258eb57a6a8ff740d85bf5dffb3349df358a78"} Mar 18 12:20:04 crc kubenswrapper[4921]: I0318 12:20:04.449529 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:04 crc kubenswrapper[4921]: I0318 12:20:04.581603 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzx4j\" (UniqueName: \"kubernetes.io/projected/9191a818-dc78-4b2f-801b-d965eeae5c8b-kube-api-access-fzx4j\") pod \"9191a818-dc78-4b2f-801b-d965eeae5c8b\" (UID: \"9191a818-dc78-4b2f-801b-d965eeae5c8b\") " Mar 18 12:20:04 crc kubenswrapper[4921]: I0318 12:20:04.588445 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9191a818-dc78-4b2f-801b-d965eeae5c8b-kube-api-access-fzx4j" (OuterVolumeSpecName: "kube-api-access-fzx4j") pod "9191a818-dc78-4b2f-801b-d965eeae5c8b" (UID: "9191a818-dc78-4b2f-801b-d965eeae5c8b"). InnerVolumeSpecName "kube-api-access-fzx4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:20:04 crc kubenswrapper[4921]: I0318 12:20:04.683211 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzx4j\" (UniqueName: \"kubernetes.io/projected/9191a818-dc78-4b2f-801b-d965eeae5c8b-kube-api-access-fzx4j\") on node \"crc\" DevicePath \"\"" Mar 18 12:20:05 crc kubenswrapper[4921]: I0318 12:20:05.267906 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" event={"ID":"9191a818-dc78-4b2f-801b-d965eeae5c8b","Type":"ContainerDied","Data":"7edbbf1e74ec1d18e81ba9f07ea398382450b96a62a8773e3e92c19c22ab0838"} Mar 18 12:20:05 crc kubenswrapper[4921]: I0318 12:20:05.267968 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7edbbf1e74ec1d18e81ba9f07ea398382450b96a62a8773e3e92c19c22ab0838" Mar 18 12:20:05 crc kubenswrapper[4921]: I0318 12:20:05.268036 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-l9d7b" Mar 18 12:20:05 crc kubenswrapper[4921]: I0318 12:20:05.511938 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-mqj9b"] Mar 18 12:20:05 crc kubenswrapper[4921]: I0318 12:20:05.514993 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-mqj9b"] Mar 18 12:20:07 crc kubenswrapper[4921]: I0318 12:20:07.215874 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd59b71-2947-4e73-872d-4e84bb7413bc" path="/var/lib/kubelet/pods/2bd59b71-2947-4e73-872d-4e84bb7413bc/volumes" Mar 18 12:20:17 crc kubenswrapper[4921]: I0318 12:20:17.080994 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:20:17 crc kubenswrapper[4921]: I0318 12:20:17.081663 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.082144 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.083148 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.083220 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.084213 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc5074ee409b0ac06bf766e43a02ba25eb50275e9168fd38e188e49a4ed789f7"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.084300 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://fc5074ee409b0ac06bf766e43a02ba25eb50275e9168fd38e188e49a4ed789f7" gracePeriod=600 Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.519993 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="fc5074ee409b0ac06bf766e43a02ba25eb50275e9168fd38e188e49a4ed789f7" exitCode=0 Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.520054 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"fc5074ee409b0ac06bf766e43a02ba25eb50275e9168fd38e188e49a4ed789f7"} Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.520389 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"766d7dfa17c3a2f9b917da54f565e4c9feb5034e40822c9f54b662ea59c3b6dc"} Mar 18 12:20:47 crc kubenswrapper[4921]: I0318 12:20:47.520414 4921 scope.go:117] "RemoveContainer" containerID="90c2627d76f56111b8091bf7cf164f6179d0606e616e7e916cb477a7a9cb4d04" Mar 18 12:20:48 crc kubenswrapper[4921]: I0318 12:20:48.083703 4921 scope.go:117] "RemoveContainer" containerID="c9613ac5784ba0aab2ba449f1282c59af9fcf8704844927504ca4638bc9d70a6" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.145140 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563942-84ssn"] Mar 18 12:22:00 crc kubenswrapper[4921]: E0318 12:22:00.145951 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9191a818-dc78-4b2f-801b-d965eeae5c8b" containerName="oc" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.145968 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9191a818-dc78-4b2f-801b-d965eeae5c8b" containerName="oc" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.146093 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9191a818-dc78-4b2f-801b-d965eeae5c8b" containerName="oc" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.146492 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.151596 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.152970 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.153206 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.157165 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-84ssn"] Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.273144 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrs9\" (UniqueName: \"kubernetes.io/projected/c168d5bc-4e2c-4365-9951-b9bd84c375d9-kube-api-access-kfrs9\") pod \"auto-csr-approver-29563942-84ssn\" (UID: \"c168d5bc-4e2c-4365-9951-b9bd84c375d9\") " pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.374130 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrs9\" (UniqueName: \"kubernetes.io/projected/c168d5bc-4e2c-4365-9951-b9bd84c375d9-kube-api-access-kfrs9\") pod \"auto-csr-approver-29563942-84ssn\" (UID: \"c168d5bc-4e2c-4365-9951-b9bd84c375d9\") " pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.394431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrs9\" (UniqueName: \"kubernetes.io/projected/c168d5bc-4e2c-4365-9951-b9bd84c375d9-kube-api-access-kfrs9\") pod \"auto-csr-approver-29563942-84ssn\" (UID: \"c168d5bc-4e2c-4365-9951-b9bd84c375d9\") " 
pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.464511 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.852809 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-84ssn"] Mar 18 12:22:00 crc kubenswrapper[4921]: I0318 12:22:00.989973 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-84ssn" event={"ID":"c168d5bc-4e2c-4365-9951-b9bd84c375d9","Type":"ContainerStarted","Data":"d87f2b132a6a7892b6d90b8d3335da91de226e1d898c704bc3240a0e731897b0"} Mar 18 12:22:03 crc kubenswrapper[4921]: I0318 12:22:03.005746 4921 generic.go:334] "Generic (PLEG): container finished" podID="c168d5bc-4e2c-4365-9951-b9bd84c375d9" containerID="7ced53491cfad4ac4d58b1a2ad9addb90fd487837b693e043699afdb7f0c658c" exitCode=0 Mar 18 12:22:03 crc kubenswrapper[4921]: I0318 12:22:03.005815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-84ssn" event={"ID":"c168d5bc-4e2c-4365-9951-b9bd84c375d9","Type":"ContainerDied","Data":"7ced53491cfad4ac4d58b1a2ad9addb90fd487837b693e043699afdb7f0c658c"} Mar 18 12:22:04 crc kubenswrapper[4921]: I0318 12:22:04.211831 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:04 crc kubenswrapper[4921]: I0318 12:22:04.324487 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrs9\" (UniqueName: \"kubernetes.io/projected/c168d5bc-4e2c-4365-9951-b9bd84c375d9-kube-api-access-kfrs9\") pod \"c168d5bc-4e2c-4365-9951-b9bd84c375d9\" (UID: \"c168d5bc-4e2c-4365-9951-b9bd84c375d9\") " Mar 18 12:22:04 crc kubenswrapper[4921]: I0318 12:22:04.331049 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c168d5bc-4e2c-4365-9951-b9bd84c375d9-kube-api-access-kfrs9" (OuterVolumeSpecName: "kube-api-access-kfrs9") pod "c168d5bc-4e2c-4365-9951-b9bd84c375d9" (UID: "c168d5bc-4e2c-4365-9951-b9bd84c375d9"). InnerVolumeSpecName "kube-api-access-kfrs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:22:04 crc kubenswrapper[4921]: I0318 12:22:04.425604 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrs9\" (UniqueName: \"kubernetes.io/projected/c168d5bc-4e2c-4365-9951-b9bd84c375d9-kube-api-access-kfrs9\") on node \"crc\" DevicePath \"\"" Mar 18 12:22:05 crc kubenswrapper[4921]: I0318 12:22:05.020341 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-84ssn" event={"ID":"c168d5bc-4e2c-4365-9951-b9bd84c375d9","Type":"ContainerDied","Data":"d87f2b132a6a7892b6d90b8d3335da91de226e1d898c704bc3240a0e731897b0"} Mar 18 12:22:05 crc kubenswrapper[4921]: I0318 12:22:05.020808 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87f2b132a6a7892b6d90b8d3335da91de226e1d898c704bc3240a0e731897b0" Mar 18 12:22:05 crc kubenswrapper[4921]: I0318 12:22:05.020578 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-84ssn" Mar 18 12:22:05 crc kubenswrapper[4921]: I0318 12:22:05.274793 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-qlwwl"] Mar 18 12:22:05 crc kubenswrapper[4921]: I0318 12:22:05.279033 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-qlwwl"] Mar 18 12:22:07 crc kubenswrapper[4921]: I0318 12:22:07.217086 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b" path="/var/lib/kubelet/pods/85e1e5fc-03ca-4cf9-a872-ea30dfa7ea9b/volumes" Mar 18 12:22:18 crc kubenswrapper[4921]: I0318 12:22:18.304851 4921 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.094156 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nj95j"] Mar 18 12:22:28 crc kubenswrapper[4921]: E0318 12:22:28.095489 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c168d5bc-4e2c-4365-9951-b9bd84c375d9" containerName="oc" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.095505 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c168d5bc-4e2c-4365-9951-b9bd84c375d9" containerName="oc" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.095844 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c168d5bc-4e2c-4365-9951-b9bd84c375d9" containerName="oc" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.096569 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.114427 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nj95j"] Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.155189 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62tq\" (UniqueName: \"kubernetes.io/projected/879097c7-8a7d-43c8-8189-2e876d3d8661-kube-api-access-t62tq\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.155329 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-catalog-content\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.155536 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-utilities\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.256484 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-catalog-content\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.256597 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-utilities\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.256633 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t62tq\" (UniqueName: \"kubernetes.io/projected/879097c7-8a7d-43c8-8189-2e876d3d8661-kube-api-access-t62tq\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.257093 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-catalog-content\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.257147 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-utilities\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.280153 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62tq\" (UniqueName: \"kubernetes.io/projected/879097c7-8a7d-43c8-8189-2e876d3d8661-kube-api-access-t62tq\") pod \"community-operators-nj95j\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.418085 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:28 crc kubenswrapper[4921]: I0318 12:22:28.635851 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nj95j"] Mar 18 12:22:29 crc kubenswrapper[4921]: I0318 12:22:29.160059 4921 generic.go:334] "Generic (PLEG): container finished" podID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerID="eb4c552fabed80755311bd8a42ac72271e9496998a04f79533849c071ae1a664" exitCode=0 Mar 18 12:22:29 crc kubenswrapper[4921]: I0318 12:22:29.160104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerDied","Data":"eb4c552fabed80755311bd8a42ac72271e9496998a04f79533849c071ae1a664"} Mar 18 12:22:29 crc kubenswrapper[4921]: I0318 12:22:29.160164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerStarted","Data":"9b4e0a62cd2bcffd1a566020a9aea28a304059ca65c23ea706723df5384ced19"} Mar 18 12:22:30 crc kubenswrapper[4921]: I0318 12:22:30.167812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerStarted","Data":"71bf206eec4893b803bd784eadb3584d29ec6c648d8719bed32a05e61b1b2b14"} Mar 18 12:22:31 crc kubenswrapper[4921]: I0318 12:22:31.174925 4921 generic.go:334] "Generic (PLEG): container finished" podID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerID="71bf206eec4893b803bd784eadb3584d29ec6c648d8719bed32a05e61b1b2b14" exitCode=0 Mar 18 12:22:31 crc kubenswrapper[4921]: I0318 12:22:31.174965 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" 
event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerDied","Data":"71bf206eec4893b803bd784eadb3584d29ec6c648d8719bed32a05e61b1b2b14"} Mar 18 12:22:32 crc kubenswrapper[4921]: I0318 12:22:32.186908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerStarted","Data":"006f922b375edfcbc5d9e329b888cf25b68199720d8d29f0c723fefe72b3ea14"} Mar 18 12:22:32 crc kubenswrapper[4921]: I0318 12:22:32.209324 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nj95j" podStartSLOduration=1.7971721029999999 podStartE2EDuration="4.209310657s" podCreationTimestamp="2026-03-18 12:22:28 +0000 UTC" firstStartedPulling="2026-03-18 12:22:29.163247646 +0000 UTC m=+768.713168285" lastFinishedPulling="2026-03-18 12:22:31.57538619 +0000 UTC m=+771.125306839" observedRunningTime="2026-03-18 12:22:32.205364275 +0000 UTC m=+771.755284924" watchObservedRunningTime="2026-03-18 12:22:32.209310657 +0000 UTC m=+771.759231296" Mar 18 12:22:38 crc kubenswrapper[4921]: I0318 12:22:38.419573 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:38 crc kubenswrapper[4921]: I0318 12:22:38.419923 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:38 crc kubenswrapper[4921]: I0318 12:22:38.456195 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:39 crc kubenswrapper[4921]: I0318 12:22:39.265756 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:39 crc kubenswrapper[4921]: I0318 12:22:39.306689 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-nj95j"] Mar 18 12:22:41 crc kubenswrapper[4921]: I0318 12:22:41.235722 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nj95j" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="registry-server" containerID="cri-o://006f922b375edfcbc5d9e329b888cf25b68199720d8d29f0c723fefe72b3ea14" gracePeriod=2 Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.243341 4921 generic.go:334] "Generic (PLEG): container finished" podID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerID="006f922b375edfcbc5d9e329b888cf25b68199720d8d29f0c723fefe72b3ea14" exitCode=0 Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.243405 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerDied","Data":"006f922b375edfcbc5d9e329b888cf25b68199720d8d29f0c723fefe72b3ea14"} Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.781900 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.829464 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-catalog-content\") pod \"879097c7-8a7d-43c8-8189-2e876d3d8661\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.829514 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-utilities\") pod \"879097c7-8a7d-43c8-8189-2e876d3d8661\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.829564 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t62tq\" (UniqueName: \"kubernetes.io/projected/879097c7-8a7d-43c8-8189-2e876d3d8661-kube-api-access-t62tq\") pod \"879097c7-8a7d-43c8-8189-2e876d3d8661\" (UID: \"879097c7-8a7d-43c8-8189-2e876d3d8661\") " Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.830511 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-utilities" (OuterVolumeSpecName: "utilities") pod "879097c7-8a7d-43c8-8189-2e876d3d8661" (UID: "879097c7-8a7d-43c8-8189-2e876d3d8661"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.835032 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879097c7-8a7d-43c8-8189-2e876d3d8661-kube-api-access-t62tq" (OuterVolumeSpecName: "kube-api-access-t62tq") pod "879097c7-8a7d-43c8-8189-2e876d3d8661" (UID: "879097c7-8a7d-43c8-8189-2e876d3d8661"). InnerVolumeSpecName "kube-api-access-t62tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.883499 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "879097c7-8a7d-43c8-8189-2e876d3d8661" (UID: "879097c7-8a7d-43c8-8189-2e876d3d8661"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.931333 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.931375 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/879097c7-8a7d-43c8-8189-2e876d3d8661-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:22:42 crc kubenswrapper[4921]: I0318 12:22:42.931391 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t62tq\" (UniqueName: \"kubernetes.io/projected/879097c7-8a7d-43c8-8189-2e876d3d8661-kube-api-access-t62tq\") on node \"crc\" DevicePath \"\"" Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.251814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nj95j" event={"ID":"879097c7-8a7d-43c8-8189-2e876d3d8661","Type":"ContainerDied","Data":"9b4e0a62cd2bcffd1a566020a9aea28a304059ca65c23ea706723df5384ced19"} Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.251875 4921 scope.go:117] "RemoveContainer" containerID="006f922b375edfcbc5d9e329b888cf25b68199720d8d29f0c723fefe72b3ea14" Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.251881 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nj95j" Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.270198 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nj95j"] Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.272884 4921 scope.go:117] "RemoveContainer" containerID="71bf206eec4893b803bd784eadb3584d29ec6c648d8719bed32a05e61b1b2b14" Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.274920 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nj95j"] Mar 18 12:22:43 crc kubenswrapper[4921]: I0318 12:22:43.293291 4921 scope.go:117] "RemoveContainer" containerID="eb4c552fabed80755311bd8a42ac72271e9496998a04f79533849c071ae1a664" Mar 18 12:22:45 crc kubenswrapper[4921]: I0318 12:22:45.220913 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" path="/var/lib/kubelet/pods/879097c7-8a7d-43c8-8189-2e876d3d8661/volumes" Mar 18 12:22:47 crc kubenswrapper[4921]: I0318 12:22:47.081262 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:22:47 crc kubenswrapper[4921]: I0318 12:22:47.081329 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:22:48 crc kubenswrapper[4921]: I0318 12:22:48.139401 4921 scope.go:117] "RemoveContainer" containerID="c550627b537072e64aefcb8a2a6b0a286d8ee50a07c31c1b370fbef03062aadd" Mar 18 12:22:55 crc kubenswrapper[4921]: 
I0318 12:22:55.514554 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-629h6"] Mar 18 12:22:55 crc kubenswrapper[4921]: E0318 12:22:55.515154 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="registry-server" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.515166 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="registry-server" Mar 18 12:22:55 crc kubenswrapper[4921]: E0318 12:22:55.515174 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="extract-utilities" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.515180 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="extract-utilities" Mar 18 12:22:55 crc kubenswrapper[4921]: E0318 12:22:55.515197 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="extract-content" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.515205 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="extract-content" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.515289 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="879097c7-8a7d-43c8-8189-2e876d3d8661" containerName="registry-server" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.515919 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.532708 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-629h6"] Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.677853 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhwj\" (UniqueName: \"kubernetes.io/projected/bca50ce0-aef5-49b8-a070-14e2d83841bc-kube-api-access-2dhwj\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.677989 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-utilities\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.678045 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-catalog-content\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.778757 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhwj\" (UniqueName: \"kubernetes.io/projected/bca50ce0-aef5-49b8-a070-14e2d83841bc-kube-api-access-2dhwj\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.778814 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-utilities\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.778839 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-catalog-content\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.779322 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-utilities\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.779369 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-catalog-content\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.795986 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhwj\" (UniqueName: \"kubernetes.io/projected/bca50ce0-aef5-49b8-a070-14e2d83841bc-kube-api-access-2dhwj\") pod \"redhat-marketplace-629h6\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:55 crc kubenswrapper[4921]: I0318 12:22:55.869757 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:22:56 crc kubenswrapper[4921]: I0318 12:22:56.089690 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-629h6"] Mar 18 12:22:56 crc kubenswrapper[4921]: I0318 12:22:56.329685 4921 generic.go:334] "Generic (PLEG): container finished" podID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerID="b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10" exitCode=0 Mar 18 12:22:56 crc kubenswrapper[4921]: I0318 12:22:56.329825 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-629h6" event={"ID":"bca50ce0-aef5-49b8-a070-14e2d83841bc","Type":"ContainerDied","Data":"b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10"} Mar 18 12:22:56 crc kubenswrapper[4921]: I0318 12:22:56.330047 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-629h6" event={"ID":"bca50ce0-aef5-49b8-a070-14e2d83841bc","Type":"ContainerStarted","Data":"363aaf5fd96b69429fc8e21c6c866704a124b3d76b56ef3a7109b07b8fa03388"} Mar 18 12:22:57 crc kubenswrapper[4921]: I0318 12:22:57.336977 4921 generic.go:334] "Generic (PLEG): container finished" podID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerID="4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10" exitCode=0 Mar 18 12:22:57 crc kubenswrapper[4921]: I0318 12:22:57.337020 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-629h6" event={"ID":"bca50ce0-aef5-49b8-a070-14e2d83841bc","Type":"ContainerDied","Data":"4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10"} Mar 18 12:22:58 crc kubenswrapper[4921]: I0318 12:22:58.345145 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-629h6" 
event={"ID":"bca50ce0-aef5-49b8-a070-14e2d83841bc","Type":"ContainerStarted","Data":"098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4"} Mar 18 12:22:58 crc kubenswrapper[4921]: I0318 12:22:58.366218 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-629h6" podStartSLOduration=1.9893493370000002 podStartE2EDuration="3.366192403s" podCreationTimestamp="2026-03-18 12:22:55 +0000 UTC" firstStartedPulling="2026-03-18 12:22:56.331303938 +0000 UTC m=+795.881224587" lastFinishedPulling="2026-03-18 12:22:57.708147014 +0000 UTC m=+797.258067653" observedRunningTime="2026-03-18 12:22:58.363983361 +0000 UTC m=+797.913904010" watchObservedRunningTime="2026-03-18 12:22:58.366192403 +0000 UTC m=+797.916113052" Mar 18 12:23:05 crc kubenswrapper[4921]: I0318 12:23:05.870395 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:23:05 crc kubenswrapper[4921]: I0318 12:23:05.870957 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:23:05 crc kubenswrapper[4921]: I0318 12:23:05.918208 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:23:06 crc kubenswrapper[4921]: I0318 12:23:06.423045 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:23:06 crc kubenswrapper[4921]: I0318 12:23:06.480061 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-629h6"] Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.398860 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-629h6" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="registry-server" 
containerID="cri-o://098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4" gracePeriod=2 Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.761197 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.952015 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-utilities\") pod \"bca50ce0-aef5-49b8-a070-14e2d83841bc\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.952459 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dhwj\" (UniqueName: \"kubernetes.io/projected/bca50ce0-aef5-49b8-a070-14e2d83841bc-kube-api-access-2dhwj\") pod \"bca50ce0-aef5-49b8-a070-14e2d83841bc\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.952591 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-catalog-content\") pod \"bca50ce0-aef5-49b8-a070-14e2d83841bc\" (UID: \"bca50ce0-aef5-49b8-a070-14e2d83841bc\") " Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.953621 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-utilities" (OuterVolumeSpecName: "utilities") pod "bca50ce0-aef5-49b8-a070-14e2d83841bc" (UID: "bca50ce0-aef5-49b8-a070-14e2d83841bc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.958716 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca50ce0-aef5-49b8-a070-14e2d83841bc-kube-api-access-2dhwj" (OuterVolumeSpecName: "kube-api-access-2dhwj") pod "bca50ce0-aef5-49b8-a070-14e2d83841bc" (UID: "bca50ce0-aef5-49b8-a070-14e2d83841bc"). InnerVolumeSpecName "kube-api-access-2dhwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:23:08 crc kubenswrapper[4921]: I0318 12:23:08.985530 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bca50ce0-aef5-49b8-a070-14e2d83841bc" (UID: "bca50ce0-aef5-49b8-a070-14e2d83841bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.054220 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.054279 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dhwj\" (UniqueName: \"kubernetes.io/projected/bca50ce0-aef5-49b8-a070-14e2d83841bc-kube-api-access-2dhwj\") on node \"crc\" DevicePath \"\"" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.054295 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca50ce0-aef5-49b8-a070-14e2d83841bc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.410157 4921 generic.go:334] "Generic (PLEG): container finished" podID="bca50ce0-aef5-49b8-a070-14e2d83841bc" 
containerID="098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4" exitCode=0 Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.410237 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-629h6" event={"ID":"bca50ce0-aef5-49b8-a070-14e2d83841bc","Type":"ContainerDied","Data":"098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4"} Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.410289 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-629h6" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.410318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-629h6" event={"ID":"bca50ce0-aef5-49b8-a070-14e2d83841bc","Type":"ContainerDied","Data":"363aaf5fd96b69429fc8e21c6c866704a124b3d76b56ef3a7109b07b8fa03388"} Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.410355 4921 scope.go:117] "RemoveContainer" containerID="098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.444952 4921 scope.go:117] "RemoveContainer" containerID="4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.450082 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-629h6"] Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.456773 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-629h6"] Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.463576 4921 scope.go:117] "RemoveContainer" containerID="b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.478534 4921 scope.go:117] "RemoveContainer" containerID="098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4" Mar 18 
12:23:09 crc kubenswrapper[4921]: E0318 12:23:09.479063 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4\": container with ID starting with 098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4 not found: ID does not exist" containerID="098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.479091 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4"} err="failed to get container status \"098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4\": rpc error: code = NotFound desc = could not find container \"098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4\": container with ID starting with 098d603a21aa0dd00e1b6e5bd2a568bb2ddec2e8cad9a9e078166a1c066fc2a4 not found: ID does not exist" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.479131 4921 scope.go:117] "RemoveContainer" containerID="4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10" Mar 18 12:23:09 crc kubenswrapper[4921]: E0318 12:23:09.479613 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10\": container with ID starting with 4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10 not found: ID does not exist" containerID="4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.479654 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10"} err="failed to get container status 
\"4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10\": rpc error: code = NotFound desc = could not find container \"4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10\": container with ID starting with 4643446e46aba37a06fc1680da36b1a2e06532162e1802f3d364ac7ab7891b10 not found: ID does not exist" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.479681 4921 scope.go:117] "RemoveContainer" containerID="b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10" Mar 18 12:23:09 crc kubenswrapper[4921]: E0318 12:23:09.479991 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10\": container with ID starting with b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10 not found: ID does not exist" containerID="b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10" Mar 18 12:23:09 crc kubenswrapper[4921]: I0318 12:23:09.480049 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10"} err="failed to get container status \"b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10\": rpc error: code = NotFound desc = could not find container \"b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10\": container with ID starting with b19ffa52839888bd5496539798ee66da264e52838ad0dda43f8fbd28005cbc10 not found: ID does not exist" Mar 18 12:23:11 crc kubenswrapper[4921]: I0318 12:23:11.217353 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" path="/var/lib/kubelet/pods/bca50ce0-aef5-49b8-a070-14e2d83841bc/volumes" Mar 18 12:23:17 crc kubenswrapper[4921]: I0318 12:23:17.081899 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:23:17 crc kubenswrapper[4921]: I0318 12:23:17.082384 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.288610 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jt5tj"] Mar 18 12:23:46 crc kubenswrapper[4921]: E0318 12:23:46.289698 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="registry-server" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.289725 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="registry-server" Mar 18 12:23:46 crc kubenswrapper[4921]: E0318 12:23:46.289748 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="extract-content" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.289762 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="extract-content" Mar 18 12:23:46 crc kubenswrapper[4921]: E0318 12:23:46.289794 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="extract-utilities" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.289820 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="extract-utilities" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 
12:23:46.290028 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca50ce0-aef5-49b8-a070-14e2d83841bc" containerName="registry-server" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.291336 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.296436 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jt5tj"] Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.356794 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-catalog-content\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.356870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcvnv\" (UniqueName: \"kubernetes.io/projected/e0f69305-792e-4b17-af69-d947e02ff1fe-kube-api-access-bcvnv\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.357146 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-utilities\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.457897 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-utilities\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.457958 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-catalog-content\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.458427 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-catalog-content\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.458420 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcvnv\" (UniqueName: \"kubernetes.io/projected/e0f69305-792e-4b17-af69-d947e02ff1fe-kube-api-access-bcvnv\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.458522 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-utilities\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.483796 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcvnv\" (UniqueName: 
\"kubernetes.io/projected/e0f69305-792e-4b17-af69-d947e02ff1fe-kube-api-access-bcvnv\") pod \"certified-operators-jt5tj\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:46 crc kubenswrapper[4921]: I0318 12:23:46.743698 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.011975 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jt5tj"] Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.081742 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.081845 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.081915 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.082912 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"766d7dfa17c3a2f9b917da54f565e4c9feb5034e40822c9f54b662ea59c3b6dc"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:23:47 crc 
kubenswrapper[4921]: I0318 12:23:47.083024 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://766d7dfa17c3a2f9b917da54f565e4c9feb5034e40822c9f54b662ea59c3b6dc" gracePeriod=600 Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.625650 4921 generic.go:334] "Generic (PLEG): container finished" podID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerID="78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca" exitCode=0 Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.625861 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt5tj" event={"ID":"e0f69305-792e-4b17-af69-d947e02ff1fe","Type":"ContainerDied","Data":"78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca"} Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.626049 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt5tj" event={"ID":"e0f69305-792e-4b17-af69-d947e02ff1fe","Type":"ContainerStarted","Data":"6e2041291c2c8e32c4f92e5ebba18cfd63d0edaa10dc274a289830dfd1afe6ab"} Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.627879 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.630240 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="766d7dfa17c3a2f9b917da54f565e4c9feb5034e40822c9f54b662ea59c3b6dc" exitCode=0 Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.630279 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"766d7dfa17c3a2f9b917da54f565e4c9feb5034e40822c9f54b662ea59c3b6dc"} Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.630341 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"ebcc7bf1aa6f60def18576e51eaa04202bf67a3ba2c684f5b12ee3391d160ae7"} Mar 18 12:23:47 crc kubenswrapper[4921]: I0318 12:23:47.630362 4921 scope.go:117] "RemoveContainer" containerID="fc5074ee409b0ac06bf766e43a02ba25eb50275e9168fd38e188e49a4ed789f7" Mar 18 12:23:49 crc kubenswrapper[4921]: I0318 12:23:49.655397 4921 generic.go:334] "Generic (PLEG): container finished" podID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerID="e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955" exitCode=0 Mar 18 12:23:49 crc kubenswrapper[4921]: I0318 12:23:49.655578 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt5tj" event={"ID":"e0f69305-792e-4b17-af69-d947e02ff1fe","Type":"ContainerDied","Data":"e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955"} Mar 18 12:23:50 crc kubenswrapper[4921]: I0318 12:23:50.663516 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt5tj" event={"ID":"e0f69305-792e-4b17-af69-d947e02ff1fe","Type":"ContainerStarted","Data":"a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672"} Mar 18 12:23:50 crc kubenswrapper[4921]: I0318 12:23:50.682755 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jt5tj" podStartSLOduration=2.175890869 podStartE2EDuration="4.682731552s" podCreationTimestamp="2026-03-18 12:23:46 +0000 UTC" firstStartedPulling="2026-03-18 12:23:47.627632932 +0000 UTC m=+847.177553571" lastFinishedPulling="2026-03-18 
12:23:50.134473615 +0000 UTC m=+849.684394254" observedRunningTime="2026-03-18 12:23:50.682545396 +0000 UTC m=+850.232466055" watchObservedRunningTime="2026-03-18 12:23:50.682731552 +0000 UTC m=+850.232652201" Mar 18 12:23:56 crc kubenswrapper[4921]: I0318 12:23:56.744429 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:56 crc kubenswrapper[4921]: I0318 12:23:56.744865 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:56 crc kubenswrapper[4921]: I0318 12:23:56.790648 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:57 crc kubenswrapper[4921]: I0318 12:23:57.740858 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:23:57 crc kubenswrapper[4921]: I0318 12:23:57.785867 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jt5tj"] Mar 18 12:23:59 crc kubenswrapper[4921]: I0318 12:23:59.714067 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jt5tj" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="registry-server" containerID="cri-o://a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672" gracePeriod=2 Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.026193 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.134755 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-catalog-content\") pod \"e0f69305-792e-4b17-af69-d947e02ff1fe\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.135152 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-utilities\") pod \"e0f69305-792e-4b17-af69-d947e02ff1fe\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.135221 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcvnv\" (UniqueName: \"kubernetes.io/projected/e0f69305-792e-4b17-af69-d947e02ff1fe-kube-api-access-bcvnv\") pod \"e0f69305-792e-4b17-af69-d947e02ff1fe\" (UID: \"e0f69305-792e-4b17-af69-d947e02ff1fe\") " Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.135881 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-utilities" (OuterVolumeSpecName: "utilities") pod "e0f69305-792e-4b17-af69-d947e02ff1fe" (UID: "e0f69305-792e-4b17-af69-d947e02ff1fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.142795 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f69305-792e-4b17-af69-d947e02ff1fe-kube-api-access-bcvnv" (OuterVolumeSpecName: "kube-api-access-bcvnv") pod "e0f69305-792e-4b17-af69-d947e02ff1fe" (UID: "e0f69305-792e-4b17-af69-d947e02ff1fe"). InnerVolumeSpecName "kube-api-access-bcvnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.154799 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563944-v2nk7"] Mar 18 12:24:00 crc kubenswrapper[4921]: E0318 12:24:00.155059 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="extract-utilities" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.155079 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="extract-utilities" Mar 18 12:24:00 crc kubenswrapper[4921]: E0318 12:24:00.155104 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="extract-content" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.155130 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="extract-content" Mar 18 12:24:00 crc kubenswrapper[4921]: E0318 12:24:00.155144 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="registry-server" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.155152 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="registry-server" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.155273 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerName="registry-server" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.155787 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.157865 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.158488 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-v2nk7"] Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.158788 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.158794 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.197035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0f69305-792e-4b17-af69-d947e02ff1fe" (UID: "e0f69305-792e-4b17-af69-d947e02ff1fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.236915 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.236960 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0f69305-792e-4b17-af69-d947e02ff1fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.236976 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcvnv\" (UniqueName: \"kubernetes.io/projected/e0f69305-792e-4b17-af69-d947e02ff1fe-kube-api-access-bcvnv\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.337645 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45bz5\" (UniqueName: \"kubernetes.io/projected/9dc9b502-7736-4dc5-afcc-2d422bce4266-kube-api-access-45bz5\") pod \"auto-csr-approver-29563944-v2nk7\" (UID: \"9dc9b502-7736-4dc5-afcc-2d422bce4266\") " pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.438520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45bz5\" (UniqueName: \"kubernetes.io/projected/9dc9b502-7736-4dc5-afcc-2d422bce4266-kube-api-access-45bz5\") pod \"auto-csr-approver-29563944-v2nk7\" (UID: \"9dc9b502-7736-4dc5-afcc-2d422bce4266\") " pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.456326 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bz5\" (UniqueName: \"kubernetes.io/projected/9dc9b502-7736-4dc5-afcc-2d422bce4266-kube-api-access-45bz5\") pod 
\"auto-csr-approver-29563944-v2nk7\" (UID: \"9dc9b502-7736-4dc5-afcc-2d422bce4266\") " pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.476805 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.647760 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-v2nk7"] Mar 18 12:24:00 crc kubenswrapper[4921]: W0318 12:24:00.654884 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dc9b502_7736_4dc5_afcc_2d422bce4266.slice/crio-c72af14a052c735905504a0b7e227d3d2b72dbcc6ac098eeb0711af7a85a2053 WatchSource:0}: Error finding container c72af14a052c735905504a0b7e227d3d2b72dbcc6ac098eeb0711af7a85a2053: Status 404 returned error can't find the container with id c72af14a052c735905504a0b7e227d3d2b72dbcc6ac098eeb0711af7a85a2053 Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.725014 4921 generic.go:334] "Generic (PLEG): container finished" podID="e0f69305-792e-4b17-af69-d947e02ff1fe" containerID="a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672" exitCode=0 Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.725066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt5tj" event={"ID":"e0f69305-792e-4b17-af69-d947e02ff1fe","Type":"ContainerDied","Data":"a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672"} Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.725352 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jt5tj" event={"ID":"e0f69305-792e-4b17-af69-d947e02ff1fe","Type":"ContainerDied","Data":"6e2041291c2c8e32c4f92e5ebba18cfd63d0edaa10dc274a289830dfd1afe6ab"} Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 
12:24:00.725373 4921 scope.go:117] "RemoveContainer" containerID="a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.725124 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jt5tj" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.726885 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" event={"ID":"9dc9b502-7736-4dc5-afcc-2d422bce4266","Type":"ContainerStarted","Data":"c72af14a052c735905504a0b7e227d3d2b72dbcc6ac098eeb0711af7a85a2053"} Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.742262 4921 scope.go:117] "RemoveContainer" containerID="e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.755138 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jt5tj"] Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.758633 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jt5tj"] Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.778065 4921 scope.go:117] "RemoveContainer" containerID="78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.790244 4921 scope.go:117] "RemoveContainer" containerID="a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672" Mar 18 12:24:00 crc kubenswrapper[4921]: E0318 12:24:00.790549 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672\": container with ID starting with a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672 not found: ID does not exist" 
containerID="a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.790582 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672"} err="failed to get container status \"a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672\": rpc error: code = NotFound desc = could not find container \"a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672\": container with ID starting with a6a1fa483e6d693a593f352346ef487134f219d242be436f67e50b878c1b6672 not found: ID does not exist" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.790601 4921 scope.go:117] "RemoveContainer" containerID="e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955" Mar 18 12:24:00 crc kubenswrapper[4921]: E0318 12:24:00.790797 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955\": container with ID starting with e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955 not found: ID does not exist" containerID="e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.790818 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955"} err="failed to get container status \"e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955\": rpc error: code = NotFound desc = could not find container \"e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955\": container with ID starting with e5ec11cff4303dfab0a4976d49c0e4de90c3db1c6a7e676ac247461568187955 not found: ID does not exist" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.790833 4921 scope.go:117] 
"RemoveContainer" containerID="78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca" Mar 18 12:24:00 crc kubenswrapper[4921]: E0318 12:24:00.791030 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca\": container with ID starting with 78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca not found: ID does not exist" containerID="78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca" Mar 18 12:24:00 crc kubenswrapper[4921]: I0318 12:24:00.791054 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca"} err="failed to get container status \"78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca\": rpc error: code = NotFound desc = could not find container \"78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca\": container with ID starting with 78bcfc27d17e1374790e0c8783e8986f80041a3f46fdeff2e5ffe6a7365785ca not found: ID does not exist" Mar 18 12:24:01 crc kubenswrapper[4921]: I0318 12:24:01.216615 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f69305-792e-4b17-af69-d947e02ff1fe" path="/var/lib/kubelet/pods/e0f69305-792e-4b17-af69-d947e02ff1fe/volumes" Mar 18 12:24:02 crc kubenswrapper[4921]: I0318 12:24:02.746433 4921 generic.go:334] "Generic (PLEG): container finished" podID="9dc9b502-7736-4dc5-afcc-2d422bce4266" containerID="872a95c472a62a31d048e26afdf3b12b40482aa2dcc422059ad01003c581709c" exitCode=0 Mar 18 12:24:02 crc kubenswrapper[4921]: I0318 12:24:02.746514 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" event={"ID":"9dc9b502-7736-4dc5-afcc-2d422bce4266","Type":"ContainerDied","Data":"872a95c472a62a31d048e26afdf3b12b40482aa2dcc422059ad01003c581709c"} Mar 18 12:24:03 
crc kubenswrapper[4921]: I0318 12:24:03.987365 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:04 crc kubenswrapper[4921]: I0318 12:24:04.090272 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45bz5\" (UniqueName: \"kubernetes.io/projected/9dc9b502-7736-4dc5-afcc-2d422bce4266-kube-api-access-45bz5\") pod \"9dc9b502-7736-4dc5-afcc-2d422bce4266\" (UID: \"9dc9b502-7736-4dc5-afcc-2d422bce4266\") " Mar 18 12:24:04 crc kubenswrapper[4921]: I0318 12:24:04.689813 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc9b502-7736-4dc5-afcc-2d422bce4266-kube-api-access-45bz5" (OuterVolumeSpecName: "kube-api-access-45bz5") pod "9dc9b502-7736-4dc5-afcc-2d422bce4266" (UID: "9dc9b502-7736-4dc5-afcc-2d422bce4266"). InnerVolumeSpecName "kube-api-access-45bz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:24:04 crc kubenswrapper[4921]: I0318 12:24:04.701971 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45bz5\" (UniqueName: \"kubernetes.io/projected/9dc9b502-7736-4dc5-afcc-2d422bce4266-kube-api-access-45bz5\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:04 crc kubenswrapper[4921]: I0318 12:24:04.759377 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" event={"ID":"9dc9b502-7736-4dc5-afcc-2d422bce4266","Type":"ContainerDied","Data":"c72af14a052c735905504a0b7e227d3d2b72dbcc6ac098eeb0711af7a85a2053"} Mar 18 12:24:04 crc kubenswrapper[4921]: I0318 12:24:04.759436 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c72af14a052c735905504a0b7e227d3d2b72dbcc6ac098eeb0711af7a85a2053" Mar 18 12:24:04 crc kubenswrapper[4921]: I0318 12:24:04.759439 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-v2nk7" Mar 18 12:24:05 crc kubenswrapper[4921]: I0318 12:24:05.041206 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-4vb5p"] Mar 18 12:24:05 crc kubenswrapper[4921]: I0318 12:24:05.045242 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-4vb5p"] Mar 18 12:24:05 crc kubenswrapper[4921]: I0318 12:24:05.221292 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a16bce2-a835-450d-9101-c9cbd238b42b" path="/var/lib/kubelet/pods/9a16bce2-a835-450d-9101-c9cbd238b42b/volumes" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.352653 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l6tb7"] Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354057 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-controller" containerID="cri-o://b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354151 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="nbdb" containerID="cri-o://14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354256 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="northd" containerID="cri-o://692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354310 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354353 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-node" containerID="cri-o://f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354396 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-acl-logging" containerID="cri-o://72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.354738 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="sbdb" containerID="cri-o://eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.407084 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovnkube-controller" containerID="cri-o://cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" gracePeriod=30 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.713637 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6tb7_357e939f-66df-4ef0-b64a-a846abdd1ecf/ovn-acl-logging/0.log" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.714966 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6tb7_357e939f-66df-4ef0-b64a-a846abdd1ecf/ovn-controller/0.log" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.715651 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720042 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-ovn\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720125 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720321 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-log-socket\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-systemd\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720437 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-log-socket" (OuterVolumeSpecName: "log-socket") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720450 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-bin\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720549 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-node-log\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720621 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-ovn-kubernetes\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720648 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720669 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-node-log" (OuterVolumeSpecName: "node-log") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720705 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720702 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720742 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-etc-openvswitch\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.720821 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721050 4921 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721084 4921 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721097 4921 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721160 4921 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 
12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721178 4921 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721187 4921 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.721196 4921 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.736294 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.781964 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sk7wg"] Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782220 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="sbdb" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782235 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="sbdb" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782242 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kubecfg-setup" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782249 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kubecfg-setup" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782260 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="nbdb" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782266 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="nbdb" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782273 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-node" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782280 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-node" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782288 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-acl-logging" Mar 18 12:24:29 crc 
kubenswrapper[4921]: I0318 12:24:29.782297 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-acl-logging" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782309 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovnkube-controller" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782318 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovnkube-controller" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782330 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-controller" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782338 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-controller" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782345 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782354 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782365 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="northd" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782373 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="northd" Mar 18 12:24:29 crc kubenswrapper[4921]: E0318 12:24:29.782393 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc9b502-7736-4dc5-afcc-2d422bce4266" containerName="oc" Mar 18 12:24:29 crc 
kubenswrapper[4921]: I0318 12:24:29.782401 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc9b502-7736-4dc5-afcc-2d422bce4266" containerName="oc" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782511 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc9b502-7736-4dc5-afcc-2d422bce4266" containerName="oc" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782524 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovnkube-controller" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782536 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-node" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782549 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="nbdb" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782558 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-controller" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782569 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="ovn-acl-logging" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782578 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="sbdb" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.782597 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerName="northd" Mar 18 12:24:29 crc 
kubenswrapper[4921]: I0318 12:24:29.784641 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.821927 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-netd\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822002 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-openvswitch\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnh75\" (UniqueName: \"kubernetes.io/projected/357e939f-66df-4ef0-b64a-a846abdd1ecf-kube-api-access-gnh75\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822069 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-config\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822095 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-env-overrides\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 
12:24:29.822182 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-netns\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822262 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822740 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-systemd-units\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822789 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-kubelet\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822813 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-var-lib-openvswitch\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822731 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822770 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822879 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822966 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.822905 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-script-lib\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823005 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-slash\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823029 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovn-node-metrics-cert\") pod \"357e939f-66df-4ef0-b64a-a846abdd1ecf\" (UID: \"357e939f-66df-4ef0-b64a-a846abdd1ecf\") " Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823351 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823371 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzj9z\" (UniqueName: \"kubernetes.io/projected/471c4129-a49d-4b79-b357-b240242129e6-kube-api-access-dzj9z\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823390 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823413 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-slash" (OuterVolumeSpecName: "host-slash") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823445 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-var-lib-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823474 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-kubelet\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823496 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823531 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-cni-netd\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823552 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-node-log\") pod 
\"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823574 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-systemd-units\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-env-overrides\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823625 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-ovn\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823660 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-etc-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823676 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod 
"357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823714 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-ovnkube-config\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823738 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/471c4129-a49d-4b79-b357-b240242129e6-ovn-node-metrics-cert\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823839 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-slash\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823841 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823880 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-log-socket\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823928 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-run-netns\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.823984 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-systemd\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824010 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-ovnkube-script-lib\") pod \"ovnkube-node-sk7wg\" (UID: 
\"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-cni-bin\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824088 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824227 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824252 4921 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824265 4921 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824281 4921 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc 
kubenswrapper[4921]: I0318 12:24:29.824293 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824308 4921 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/357e939f-66df-4ef0-b64a-a846abdd1ecf-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824318 4921 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824329 4921 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824340 4921 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824352 4921 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.824366 4921 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/357e939f-66df-4ef0-b64a-a846abdd1ecf-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.827203 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.827421 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357e939f-66df-4ef0-b64a-a846abdd1ecf-kube-api-access-gnh75" (OuterVolumeSpecName: "kube-api-access-gnh75") pod "357e939f-66df-4ef0-b64a-a846abdd1ecf" (UID: "357e939f-66df-4ef0-b64a-a846abdd1ecf"). InnerVolumeSpecName "kube-api-access-gnh75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.908057 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gkdzx_888e124c-ec0f-4c32-bd78-1ff258933bde/kube-multus/0.log" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.908142 4921 generic.go:334] "Generic (PLEG): container finished" podID="888e124c-ec0f-4c32-bd78-1ff258933bde" containerID="1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878" exitCode=2 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.908268 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gkdzx" event={"ID":"888e124c-ec0f-4c32-bd78-1ff258933bde","Type":"ContainerDied","Data":"1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.909142 4921 scope.go:117] "RemoveContainer" containerID="1e60813e86ebbd510e3a55e735eaff7ea1a5d7d6650c4daa664679d3c382f878" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926723 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-openvswitch\") pod 
\"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926789 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzj9z\" (UniqueName: \"kubernetes.io/projected/471c4129-a49d-4b79-b357-b240242129e6-kube-api-access-dzj9z\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-var-lib-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926847 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-kubelet\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926867 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926897 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-openvswitch\") pod 
\"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926908 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-node-log\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926977 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-var-lib-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927019 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-cni-netd\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927081 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-systemd-units\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927244 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-cni-netd\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.926960 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-node-log\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927280 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-kubelet\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927306 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-systemd-units\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927395 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-env-overrides\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc 
kubenswrapper[4921]: I0318 12:24:29.927418 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-ovn\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927446 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-etc-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927482 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-ovnkube-config\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/471c4129-a49d-4b79-b357-b240242129e6-ovn-node-metrics-cert\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927524 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-slash\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927545 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-log-socket\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927562 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927563 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-ovn\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927603 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-run-netns\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927626 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-etc-openvswitch\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-log-socket\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928430 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-slash\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928561 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-ovnkube-config\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.927583 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-run-netns\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-systemd\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928691 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-ovnkube-script-lib\") pod 
\"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928745 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-cni-bin\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928906 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/357e939f-66df-4ef0-b64a-a846abdd1ecf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.928935 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnh75\" (UniqueName: \"kubernetes.io/projected/357e939f-66df-4ef0-b64a-a846abdd1ecf-kube-api-access-gnh75\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.929062 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-env-overrides\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.929222 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-run-systemd\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.929637 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.929717 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/471c4129-a49d-4b79-b357-b240242129e6-ovnkube-script-lib\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.929827 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/471c4129-a49d-4b79-b357-b240242129e6-host-cni-bin\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.932074 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6tb7_357e939f-66df-4ef0-b64a-a846abdd1ecf/ovn-acl-logging/0.log" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.946832 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/471c4129-a49d-4b79-b357-b240242129e6-ovn-node-metrics-cert\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.947927 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6tb7_357e939f-66df-4ef0-b64a-a846abdd1ecf/ovn-controller/0.log" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.948921 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" 
containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" exitCode=0 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.948997 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" exitCode=0 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949028 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" exitCode=0 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949052 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" exitCode=0 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949066 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" exitCode=0 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949079 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" exitCode=0 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949091 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" exitCode=143 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949105 4921 generic.go:334] "Generic (PLEG): container finished" podID="357e939f-66df-4ef0-b64a-a846abdd1ecf" containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" exitCode=143 Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949172 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949257 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949292 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949324 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949352 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949379 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949424 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949450 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949466 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949487 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949509 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949527 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949542 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949557 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949573 4921 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949588 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949603 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949618 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949633 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949655 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949679 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949698 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} Mar 18 
12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949879 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949902 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949917 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949930 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949940 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949954 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949964 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.949986 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" 
event={"ID":"357e939f-66df-4ef0-b64a-a846abdd1ecf","Type":"ContainerDied","Data":"1302961abac4154ed57ee62ff75a67eaca79098b936d587d758e976384f2c48e"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950005 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950018 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950028 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950038 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950049 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950059 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950069 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950081 4921 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950091 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950180 4921 scope.go:117] "RemoveContainer" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.950590 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6tb7" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.954248 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzj9z\" (UniqueName: \"kubernetes.io/projected/471c4129-a49d-4b79-b357-b240242129e6-kube-api-access-dzj9z\") pod \"ovnkube-node-sk7wg\" (UID: \"471c4129-a49d-4b79-b357-b240242129e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:29 crc kubenswrapper[4921]: I0318 12:24:29.985659 4921 scope.go:117] "RemoveContainer" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.013082 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l6tb7"] Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.017597 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l6tb7"] Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.019893 4921 scope.go:117] "RemoveContainer" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.034862 4921 scope.go:117] "RemoveContainer" 
containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.050358 4921 scope.go:117] "RemoveContainer" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.068853 4921 scope.go:117] "RemoveContainer" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.083529 4921 scope.go:117] "RemoveContainer" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.096370 4921 scope.go:117] "RemoveContainer" containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.100314 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.123660 4921 scope.go:117] "RemoveContainer" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.154047 4921 scope.go:117] "RemoveContainer" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.154628 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": container with ID starting with cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d not found: ID does not exist" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.154697 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} err="failed to get container status \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": rpc error: code = NotFound desc = could not find container \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": container with ID starting with cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.154727 4921 scope.go:117] "RemoveContainer" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.155140 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": container with ID starting with eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3 not found: ID does not exist" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.155209 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} err="failed to get container status \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": rpc error: code = NotFound desc = could not find container \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": container with ID starting with eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.155266 4921 scope.go:117] "RemoveContainer" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.155691 4921 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": container with ID starting with 14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f not found: ID does not exist" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.155741 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} err="failed to get container status \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": rpc error: code = NotFound desc = could not find container \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": container with ID starting with 14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.155766 4921 scope.go:117] "RemoveContainer" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.156017 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": container with ID starting with 692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13 not found: ID does not exist" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.156068 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} err="failed to get container status \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": rpc error: code = NotFound desc = could not find container 
\"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": container with ID starting with 692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.156096 4921 scope.go:117] "RemoveContainer" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.156374 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": container with ID starting with 009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c not found: ID does not exist" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.156405 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} err="failed to get container status \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": rpc error: code = NotFound desc = could not find container \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": container with ID starting with 009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.156424 4921 scope.go:117] "RemoveContainer" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.156941 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": container with ID starting with f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3 not found: ID does not exist" 
containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.156976 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} err="failed to get container status \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": rpc error: code = NotFound desc = could not find container \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": container with ID starting with f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.156994 4921 scope.go:117] "RemoveContainer" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.157554 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": container with ID starting with 72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095 not found: ID does not exist" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.157645 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} err="failed to get container status \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": rpc error: code = NotFound desc = could not find container \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": container with ID starting with 72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.157693 4921 scope.go:117] 
"RemoveContainer" containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.158163 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": container with ID starting with b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90 not found: ID does not exist" containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.158207 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} err="failed to get container status \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": rpc error: code = NotFound desc = could not find container \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": container with ID starting with b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.158238 4921 scope.go:117] "RemoveContainer" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" Mar 18 12:24:30 crc kubenswrapper[4921]: E0318 12:24:30.158622 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": container with ID starting with 6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3 not found: ID does not exist" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.158663 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} err="failed to get container status \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": rpc error: code = NotFound desc = could not find container \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": container with ID starting with 6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.158687 4921 scope.go:117] "RemoveContainer" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.159051 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} err="failed to get container status \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": rpc error: code = NotFound desc = could not find container \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": container with ID starting with cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.159086 4921 scope.go:117] "RemoveContainer" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.159569 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} err="failed to get container status \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": rpc error: code = NotFound desc = could not find container \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": container with ID starting with eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3 not found: ID does not 
exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.159599 4921 scope.go:117] "RemoveContainer" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.159868 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} err="failed to get container status \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": rpc error: code = NotFound desc = could not find container \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": container with ID starting with 14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.159893 4921 scope.go:117] "RemoveContainer" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.160839 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} err="failed to get container status \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": rpc error: code = NotFound desc = could not find container \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": container with ID starting with 692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.160870 4921 scope.go:117] "RemoveContainer" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.161192 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} err="failed to get container status 
\"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": rpc error: code = NotFound desc = could not find container \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": container with ID starting with 009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.161222 4921 scope.go:117] "RemoveContainer" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.161505 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} err="failed to get container status \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": rpc error: code = NotFound desc = could not find container \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": container with ID starting with f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.161537 4921 scope.go:117] "RemoveContainer" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.161773 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} err="failed to get container status \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": rpc error: code = NotFound desc = could not find container \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": container with ID starting with 72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.161835 4921 scope.go:117] "RemoveContainer" 
containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.162085 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} err="failed to get container status \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": rpc error: code = NotFound desc = could not find container \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": container with ID starting with b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.162179 4921 scope.go:117] "RemoveContainer" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.162464 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} err="failed to get container status \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": rpc error: code = NotFound desc = could not find container \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": container with ID starting with 6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.162507 4921 scope.go:117] "RemoveContainer" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.163042 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} err="failed to get container status \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": rpc error: code = NotFound desc = could 
not find container \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": container with ID starting with cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.163082 4921 scope.go:117] "RemoveContainer" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.163486 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} err="failed to get container status \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": rpc error: code = NotFound desc = could not find container \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": container with ID starting with eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.163521 4921 scope.go:117] "RemoveContainer" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.163824 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} err="failed to get container status \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": rpc error: code = NotFound desc = could not find container \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": container with ID starting with 14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.163852 4921 scope.go:117] "RemoveContainer" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 
12:24:30.164155 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} err="failed to get container status \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": rpc error: code = NotFound desc = could not find container \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": container with ID starting with 692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.164183 4921 scope.go:117] "RemoveContainer" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.164508 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} err="failed to get container status \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": rpc error: code = NotFound desc = could not find container \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": container with ID starting with 009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.164541 4921 scope.go:117] "RemoveContainer" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.164876 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} err="failed to get container status \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": rpc error: code = NotFound desc = could not find container \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": container with ID starting with 
f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.164952 4921 scope.go:117] "RemoveContainer" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.165304 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} err="failed to get container status \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": rpc error: code = NotFound desc = could not find container \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": container with ID starting with 72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.165336 4921 scope.go:117] "RemoveContainer" containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.165621 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} err="failed to get container status \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": rpc error: code = NotFound desc = could not find container \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": container with ID starting with b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.165666 4921 scope.go:117] "RemoveContainer" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.166123 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} err="failed to get container status \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": rpc error: code = NotFound desc = could not find container \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": container with ID starting with 6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.166152 4921 scope.go:117] "RemoveContainer" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.166457 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} err="failed to get container status \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": rpc error: code = NotFound desc = could not find container \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": container with ID starting with cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.166500 4921 scope.go:117] "RemoveContainer" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.166829 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} err="failed to get container status \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": rpc error: code = NotFound desc = could not find container \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": container with ID starting with eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3 not found: ID does not 
exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.166901 4921 scope.go:117] "RemoveContainer" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.167360 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} err="failed to get container status \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": rpc error: code = NotFound desc = could not find container \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": container with ID starting with 14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.167400 4921 scope.go:117] "RemoveContainer" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.167706 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} err="failed to get container status \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": rpc error: code = NotFound desc = could not find container \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": container with ID starting with 692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.167751 4921 scope.go:117] "RemoveContainer" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.168086 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} err="failed to get container status 
\"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": rpc error: code = NotFound desc = could not find container \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": container with ID starting with 009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.168350 4921 scope.go:117] "RemoveContainer" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.168817 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} err="failed to get container status \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": rpc error: code = NotFound desc = could not find container \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": container with ID starting with f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.168852 4921 scope.go:117] "RemoveContainer" containerID="72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.169490 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095"} err="failed to get container status \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": rpc error: code = NotFound desc = could not find container \"72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095\": container with ID starting with 72032078fe7e053ec42dedd8efbba065986ff47906ff64be9476d5cdc5e53095 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.169522 4921 scope.go:117] "RemoveContainer" 
containerID="b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.169777 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90"} err="failed to get container status \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": rpc error: code = NotFound desc = could not find container \"b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90\": container with ID starting with b26a9742e2f1f03f1aa85a7aaa9791a5c3e2916fe1bb00650e2e50c40f085e90 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.169808 4921 scope.go:117] "RemoveContainer" containerID="6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.170031 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3"} err="failed to get container status \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": rpc error: code = NotFound desc = could not find container \"6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3\": container with ID starting with 6805fe564145b21713125c3e4c32b55486a823cb266d84d5567b66b30b64c6e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.170064 4921 scope.go:117] "RemoveContainer" containerID="cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.170345 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d"} err="failed to get container status \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": rpc error: code = NotFound desc = could 
not find container \"cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d\": container with ID starting with cec6e08b906535f4f1ca5dfa4345a93c7bea8b0c190cf1d9a8701abf99e1be3d not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.170374 4921 scope.go:117] "RemoveContainer" containerID="eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.170778 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3"} err="failed to get container status \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": rpc error: code = NotFound desc = could not find container \"eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3\": container with ID starting with eb947bfd23646ab96d7b8c66cddbc606a38ba574f3a0d70c21cedc45ac14d7e3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.170808 4921 scope.go:117] "RemoveContainer" containerID="14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.171103 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f"} err="failed to get container status \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": rpc error: code = NotFound desc = could not find container \"14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f\": container with ID starting with 14cb39b7e703b551138d350fc9283c5be0eef6b9c001a75b53414828ccf70a1f not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.171151 4921 scope.go:117] "RemoveContainer" containerID="692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 
12:24:30.171558 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13"} err="failed to get container status \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": rpc error: code = NotFound desc = could not find container \"692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13\": container with ID starting with 692441c8dce5f11dd4a1731bf16b910ca3aa39d944e97a66bbcbf5a1c0fbcd13 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.171591 4921 scope.go:117] "RemoveContainer" containerID="009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.171930 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c"} err="failed to get container status \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": rpc error: code = NotFound desc = could not find container \"009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c\": container with ID starting with 009bc2004237c69b432f0e1cf3f066b0e2185f201220dd60e4f0297c27aebe1c not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.171959 4921 scope.go:117] "RemoveContainer" containerID="f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.172324 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3"} err="failed to get container status \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": rpc error: code = NotFound desc = could not find container \"f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3\": container with ID starting with 
f0422d3f92adaa90b14a3308bf81ec44ba757f77269cbfb0c22af0ed7c62bed3 not found: ID does not exist" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.958448 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gkdzx_888e124c-ec0f-4c32-bd78-1ff258933bde/kube-multus/0.log" Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.959017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gkdzx" event={"ID":"888e124c-ec0f-4c32-bd78-1ff258933bde","Type":"ContainerStarted","Data":"f406274645c20df3686eab33d8230345a580dce76ad8a989839d41abb8d3927e"} Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.962654 4921 generic.go:334] "Generic (PLEG): container finished" podID="471c4129-a49d-4b79-b357-b240242129e6" containerID="9b50c44fef384e977e5c5f15602412543ecd6fc8ed9d69f85c8659d928a392db" exitCode=0 Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.962761 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerDied","Data":"9b50c44fef384e977e5c5f15602412543ecd6fc8ed9d69f85c8659d928a392db"} Mar 18 12:24:30 crc kubenswrapper[4921]: I0318 12:24:30.962804 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"42cfb03a7c061e0a33ede467534054d7bba445a2f7add10188a064d579867a43"} Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.218437 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357e939f-66df-4ef0-b64a-a846abdd1ecf" path="/var/lib/kubelet/pods/357e939f-66df-4ef0-b64a-a846abdd1ecf/volumes" Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.973666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" 
event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"6dfc12f17b9afd422d3b440f730c449a422be962d3e125834942ef573cd48432"} Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.974141 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"77ad85aec231a2991906bebb81b37fa356ef40f944b3839cd959bd02c35b8d80"} Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.974156 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"0ca54e0a57973dbd7dbc45f8a1d58d3e70714d3dfd0654a6f972078bc0c64ee4"} Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.974168 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"d13a00b4db0ffb2ec230b3ac58c990d6e854e9706eb18de783084c757bd94ace"} Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.974179 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"bc1e0969cd957299c44a1ee2be75d3b65bba6fba2ddada16260106da83752726"} Mar 18 12:24:31 crc kubenswrapper[4921]: I0318 12:24:31.974188 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"143f6319938dc73cc0983a443faaa1e0bec2004c46fd6c48fcd062b3cc69c298"} Mar 18 12:24:33 crc kubenswrapper[4921]: I0318 12:24:33.989378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" 
event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"f1acfdf53cc3af21eba5273a7acd46f4e5aa9e7cae868e40e06b6f82abe1578b"} Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.243661 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-d5qjc"] Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.244593 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.246928 4921 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-rlgxp" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.247522 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.247523 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.247719 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.316383 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7nf\" (UniqueName: \"kubernetes.io/projected/a7d0865c-c0b1-486c-8e28-92a275c035b6-kube-api-access-7b7nf\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.316459 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7d0865c-c0b1-486c-8e28-92a275c035b6-node-mnt\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: 
I0318 12:24:35.316482 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7d0865c-c0b1-486c-8e28-92a275c035b6-crc-storage\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.417889 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7d0865c-c0b1-486c-8e28-92a275c035b6-node-mnt\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.417938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7d0865c-c0b1-486c-8e28-92a275c035b6-crc-storage\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.417995 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7nf\" (UniqueName: \"kubernetes.io/projected/a7d0865c-c0b1-486c-8e28-92a275c035b6-kube-api-access-7b7nf\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.418271 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7d0865c-c0b1-486c-8e28-92a275c035b6-node-mnt\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.419527 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/a7d0865c-c0b1-486c-8e28-92a275c035b6-crc-storage\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.437044 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7nf\" (UniqueName: \"kubernetes.io/projected/a7d0865c-c0b1-486c-8e28-92a275c035b6-kube-api-access-7b7nf\") pod \"crc-storage-crc-d5qjc\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: I0318 12:24:35.562055 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: E0318 12:24:35.591340 4921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(45f14d889767229dfbf64c1fedfb9d418ecf3b919834fa22c75cc6da23305a3e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:24:35 crc kubenswrapper[4921]: E0318 12:24:35.591423 4921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(45f14d889767229dfbf64c1fedfb9d418ecf3b919834fa22c75cc6da23305a3e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: E0318 12:24:35.591450 4921 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(45f14d889767229dfbf64c1fedfb9d418ecf3b919834fa22c75cc6da23305a3e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:35 crc kubenswrapper[4921]: E0318 12:24:35.591502 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-d5qjc_crc-storage(a7d0865c-c0b1-486c-8e28-92a275c035b6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-d5qjc_crc-storage(a7d0865c-c0b1-486c-8e28-92a275c035b6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(45f14d889767229dfbf64c1fedfb9d418ecf3b919834fa22c75cc6da23305a3e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-d5qjc" podUID="a7d0865c-c0b1-486c-8e28-92a275c035b6" Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.010606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" event={"ID":"471c4129-a49d-4b79-b357-b240242129e6","Type":"ContainerStarted","Data":"6d448b74c9796fa1b6c63f2e1ad174f635fe4efb263317d2289cd599714b67d7"} Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.011306 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.011323 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.011333 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.038065 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.040659 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:24:37 crc kubenswrapper[4921]: I0318 12:24:37.054381 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" podStartSLOduration=8.054349926 podStartE2EDuration="8.054349926s" podCreationTimestamp="2026-03-18 12:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:24:37.051009471 +0000 UTC m=+896.600930120" watchObservedRunningTime="2026-03-18 12:24:37.054349926 +0000 UTC m=+896.604270595" Mar 18 12:24:39 crc kubenswrapper[4921]: I0318 12:24:39.386810 
4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-d5qjc"] Mar 18 12:24:39 crc kubenswrapper[4921]: I0318 12:24:39.387248 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:39 crc kubenswrapper[4921]: I0318 12:24:39.387615 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:39 crc kubenswrapper[4921]: E0318 12:24:39.414998 4921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(299ad2b512698fff4056c70c3ac53a3ae44bf80cf8c311f8a0f81cfe94d0ad22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:24:39 crc kubenswrapper[4921]: E0318 12:24:39.415077 4921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(299ad2b512698fff4056c70c3ac53a3ae44bf80cf8c311f8a0f81cfe94d0ad22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:39 crc kubenswrapper[4921]: E0318 12:24:39.415121 4921 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(299ad2b512698fff4056c70c3ac53a3ae44bf80cf8c311f8a0f81cfe94d0ad22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:39 crc kubenswrapper[4921]: E0318 12:24:39.415192 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-d5qjc_crc-storage(a7d0865c-c0b1-486c-8e28-92a275c035b6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-d5qjc_crc-storage(a7d0865c-c0b1-486c-8e28-92a275c035b6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-d5qjc_crc-storage_a7d0865c-c0b1-486c-8e28-92a275c035b6_0(299ad2b512698fff4056c70c3ac53a3ae44bf80cf8c311f8a0f81cfe94d0ad22): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-d5qjc" podUID="a7d0865c-c0b1-486c-8e28-92a275c035b6" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.220686 4921 scope.go:117] "RemoveContainer" containerID="c418b070ea1c94a129f2580a7157f13eb622ed973a4c14b4e4f56dffe0ae31f9" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.487446 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9wnvx"] Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.489364 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.501930 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wnvx"] Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.593522 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-utilities\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.593677 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-catalog-content\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.593738 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtclf\" (UniqueName: \"kubernetes.io/projected/59b3de21-b428-43f6-9783-517fda99e679-kube-api-access-vtclf\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.695010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-utilities\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.695138 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-catalog-content\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.695176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtclf\" (UniqueName: \"kubernetes.io/projected/59b3de21-b428-43f6-9783-517fda99e679-kube-api-access-vtclf\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.695576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-utilities\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.695608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-catalog-content\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.724280 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtclf\" (UniqueName: \"kubernetes.io/projected/59b3de21-b428-43f6-9783-517fda99e679-kube-api-access-vtclf\") pod \"redhat-operators-9wnvx\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:48 crc kubenswrapper[4921]: I0318 12:24:48.815401 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:49 crc kubenswrapper[4921]: I0318 12:24:49.037236 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9wnvx"] Mar 18 12:24:49 crc kubenswrapper[4921]: W0318 12:24:49.039539 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59b3de21_b428_43f6_9783_517fda99e679.slice/crio-98ca1746308388ed1b947706e24afe8492ccbf14341af31b6ee2d2a2b8fbfc3f WatchSource:0}: Error finding container 98ca1746308388ed1b947706e24afe8492ccbf14341af31b6ee2d2a2b8fbfc3f: Status 404 returned error can't find the container with id 98ca1746308388ed1b947706e24afe8492ccbf14341af31b6ee2d2a2b8fbfc3f Mar 18 12:24:49 crc kubenswrapper[4921]: I0318 12:24:49.109507 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerStarted","Data":"98ca1746308388ed1b947706e24afe8492ccbf14341af31b6ee2d2a2b8fbfc3f"} Mar 18 12:24:50 crc kubenswrapper[4921]: I0318 12:24:50.118205 4921 generic.go:334] "Generic (PLEG): container finished" podID="59b3de21-b428-43f6-9783-517fda99e679" containerID="6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7" exitCode=0 Mar 18 12:24:50 crc kubenswrapper[4921]: I0318 12:24:50.118474 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerDied","Data":"6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7"} Mar 18 12:24:51 crc kubenswrapper[4921]: I0318 12:24:51.129042 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" 
event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerStarted","Data":"8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b"} Mar 18 12:24:52 crc kubenswrapper[4921]: I0318 12:24:52.137614 4921 generic.go:334] "Generic (PLEG): container finished" podID="59b3de21-b428-43f6-9783-517fda99e679" containerID="8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b" exitCode=0 Mar 18 12:24:52 crc kubenswrapper[4921]: I0318 12:24:52.137687 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerDied","Data":"8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b"} Mar 18 12:24:52 crc kubenswrapper[4921]: I0318 12:24:52.208656 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:52 crc kubenswrapper[4921]: I0318 12:24:52.209444 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:52 crc kubenswrapper[4921]: I0318 12:24:52.639176 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-d5qjc"] Mar 18 12:24:52 crc kubenswrapper[4921]: W0318 12:24:52.650181 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d0865c_c0b1_486c_8e28_92a275c035b6.slice/crio-376622487ffa792d55c6aa7a2fa7179079df3a9f8205f73f2fb78898c25f3a5e WatchSource:0}: Error finding container 376622487ffa792d55c6aa7a2fa7179079df3a9f8205f73f2fb78898c25f3a5e: Status 404 returned error can't find the container with id 376622487ffa792d55c6aa7a2fa7179079df3a9f8205f73f2fb78898c25f3a5e Mar 18 12:24:53 crc kubenswrapper[4921]: I0318 12:24:53.145873 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d5qjc" event={"ID":"a7d0865c-c0b1-486c-8e28-92a275c035b6","Type":"ContainerStarted","Data":"376622487ffa792d55c6aa7a2fa7179079df3a9f8205f73f2fb78898c25f3a5e"} Mar 18 12:24:54 crc kubenswrapper[4921]: I0318 12:24:54.156365 4921 generic.go:334] "Generic (PLEG): container finished" podID="a7d0865c-c0b1-486c-8e28-92a275c035b6" containerID="fdbcb2743b54a0e1c0974fb26a0cef13d8a15b40737f6b8c1e9edc268cfbd6d3" exitCode=0 Mar 18 12:24:54 crc kubenswrapper[4921]: I0318 12:24:54.156439 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d5qjc" event={"ID":"a7d0865c-c0b1-486c-8e28-92a275c035b6","Type":"ContainerDied","Data":"fdbcb2743b54a0e1c0974fb26a0cef13d8a15b40737f6b8c1e9edc268cfbd6d3"} Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.167224 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerStarted","Data":"face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8"} Mar 18 12:24:55 crc kubenswrapper[4921]: 
I0318 12:24:55.189453 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9wnvx" podStartSLOduration=3.311911037 podStartE2EDuration="7.189435088s" podCreationTimestamp="2026-03-18 12:24:48 +0000 UTC" firstStartedPulling="2026-03-18 12:24:50.121303979 +0000 UTC m=+909.671224628" lastFinishedPulling="2026-03-18 12:24:53.99882804 +0000 UTC m=+913.548748679" observedRunningTime="2026-03-18 12:24:55.186573207 +0000 UTC m=+914.736493846" watchObservedRunningTime="2026-03-18 12:24:55.189435088 +0000 UTC m=+914.739355727" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.396696 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.476705 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7d0865c-c0b1-486c-8e28-92a275c035b6-crc-storage\") pod \"a7d0865c-c0b1-486c-8e28-92a275c035b6\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.476745 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7d0865c-c0b1-486c-8e28-92a275c035b6-node-mnt\") pod \"a7d0865c-c0b1-486c-8e28-92a275c035b6\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.476818 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7nf\" (UniqueName: \"kubernetes.io/projected/a7d0865c-c0b1-486c-8e28-92a275c035b6-kube-api-access-7b7nf\") pod \"a7d0865c-c0b1-486c-8e28-92a275c035b6\" (UID: \"a7d0865c-c0b1-486c-8e28-92a275c035b6\") " Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.477258 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a7d0865c-c0b1-486c-8e28-92a275c035b6-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a7d0865c-c0b1-486c-8e28-92a275c035b6" (UID: "a7d0865c-c0b1-486c-8e28-92a275c035b6"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.482450 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d0865c-c0b1-486c-8e28-92a275c035b6-kube-api-access-7b7nf" (OuterVolumeSpecName: "kube-api-access-7b7nf") pod "a7d0865c-c0b1-486c-8e28-92a275c035b6" (UID: "a7d0865c-c0b1-486c-8e28-92a275c035b6"). InnerVolumeSpecName "kube-api-access-7b7nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.498040 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d0865c-c0b1-486c-8e28-92a275c035b6-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a7d0865c-c0b1-486c-8e28-92a275c035b6" (UID: "a7d0865c-c0b1-486c-8e28-92a275c035b6"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.577588 4921 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7d0865c-c0b1-486c-8e28-92a275c035b6-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.577799 4921 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7d0865c-c0b1-486c-8e28-92a275c035b6-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:55 crc kubenswrapper[4921]: I0318 12:24:55.577892 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7nf\" (UniqueName: \"kubernetes.io/projected/a7d0865c-c0b1-486c-8e28-92a275c035b6-kube-api-access-7b7nf\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:56 crc kubenswrapper[4921]: I0318 12:24:56.176189 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d5qjc" event={"ID":"a7d0865c-c0b1-486c-8e28-92a275c035b6","Type":"ContainerDied","Data":"376622487ffa792d55c6aa7a2fa7179079df3a9f8205f73f2fb78898c25f3a5e"} Mar 18 12:24:56 crc kubenswrapper[4921]: I0318 12:24:56.176909 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="376622487ffa792d55c6aa7a2fa7179079df3a9f8205f73f2fb78898c25f3a5e" Mar 18 12:24:56 crc kubenswrapper[4921]: I0318 12:24:56.176243 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-d5qjc" Mar 18 12:24:58 crc kubenswrapper[4921]: I0318 12:24:58.816050 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:58 crc kubenswrapper[4921]: I0318 12:24:58.816829 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:24:59 crc kubenswrapper[4921]: I0318 12:24:59.849215 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9wnvx" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="registry-server" probeResult="failure" output=< Mar 18 12:24:59 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 12:24:59 crc kubenswrapper[4921]: > Mar 18 12:25:00 crc kubenswrapper[4921]: I0318 12:25:00.131711 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sk7wg" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.171321 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz"] Mar 18 12:25:03 crc kubenswrapper[4921]: E0318 12:25:03.172177 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d0865c-c0b1-486c-8e28-92a275c035b6" containerName="storage" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.172195 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d0865c-c0b1-486c-8e28-92a275c035b6" containerName="storage" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.172316 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d0865c-c0b1-486c-8e28-92a275c035b6" containerName="storage" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.173255 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.185349 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz"] Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.185955 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.273039 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94csf\" (UniqueName: \"kubernetes.io/projected/af17cbc8-f48a-499e-af36-8f4420f43e4d-kube-api-access-94csf\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.273130 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.273180 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: 
I0318 12:25:03.374709 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.374776 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94csf\" (UniqueName: \"kubernetes.io/projected/af17cbc8-f48a-499e-af36-8f4420f43e4d-kube-api-access-94csf\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.374858 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.375398 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.375522 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.393164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94csf\" (UniqueName: \"kubernetes.io/projected/af17cbc8-f48a-499e-af36-8f4420f43e4d-kube-api-access-94csf\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.524310 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:03 crc kubenswrapper[4921]: I0318 12:25:03.925824 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz"] Mar 18 12:25:04 crc kubenswrapper[4921]: I0318 12:25:04.220821 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" event={"ID":"af17cbc8-f48a-499e-af36-8f4420f43e4d","Type":"ContainerStarted","Data":"320ab6a5fb3c5d964010fab27c1d3d3cb152cc60276e58201e033fe2a6cfa9b0"} Mar 18 12:25:04 crc kubenswrapper[4921]: I0318 12:25:04.221097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" event={"ID":"af17cbc8-f48a-499e-af36-8f4420f43e4d","Type":"ContainerStarted","Data":"6fc44232d4842ab3b111a902fe4ee1ac02481ad992f79aea0256ce2935d210e7"} Mar 18 12:25:05 crc kubenswrapper[4921]: I0318 12:25:05.231809 4921 
generic.go:334] "Generic (PLEG): container finished" podID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerID="320ab6a5fb3c5d964010fab27c1d3d3cb152cc60276e58201e033fe2a6cfa9b0" exitCode=0 Mar 18 12:25:05 crc kubenswrapper[4921]: I0318 12:25:05.231875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" event={"ID":"af17cbc8-f48a-499e-af36-8f4420f43e4d","Type":"ContainerDied","Data":"320ab6a5fb3c5d964010fab27c1d3d3cb152cc60276e58201e033fe2a6cfa9b0"} Mar 18 12:25:07 crc kubenswrapper[4921]: I0318 12:25:07.254772 4921 generic.go:334] "Generic (PLEG): container finished" podID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerID="543cf8c4e66d71159daa09edddd38a2dd6a893f762d53b3bf11bc7feeea117dc" exitCode=0 Mar 18 12:25:07 crc kubenswrapper[4921]: I0318 12:25:07.255052 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" event={"ID":"af17cbc8-f48a-499e-af36-8f4420f43e4d","Type":"ContainerDied","Data":"543cf8c4e66d71159daa09edddd38a2dd6a893f762d53b3bf11bc7feeea117dc"} Mar 18 12:25:08 crc kubenswrapper[4921]: I0318 12:25:08.265320 4921 generic.go:334] "Generic (PLEG): container finished" podID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerID="4aea96271a1b4af4f2ba6f20fb4df7d8214369326c2704168a5c14ffc2dac390" exitCode=0 Mar 18 12:25:08 crc kubenswrapper[4921]: I0318 12:25:08.265383 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" event={"ID":"af17cbc8-f48a-499e-af36-8f4420f43e4d","Type":"ContainerDied","Data":"4aea96271a1b4af4f2ba6f20fb4df7d8214369326c2704168a5c14ffc2dac390"} Mar 18 12:25:08 crc kubenswrapper[4921]: I0318 12:25:08.851515 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:25:08 crc 
kubenswrapper[4921]: I0318 12:25:08.898571 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.519335 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.655189 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94csf\" (UniqueName: \"kubernetes.io/projected/af17cbc8-f48a-499e-af36-8f4420f43e4d-kube-api-access-94csf\") pod \"af17cbc8-f48a-499e-af36-8f4420f43e4d\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.655265 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-bundle\") pod \"af17cbc8-f48a-499e-af36-8f4420f43e4d\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.655438 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-util\") pod \"af17cbc8-f48a-499e-af36-8f4420f43e4d\" (UID: \"af17cbc8-f48a-499e-af36-8f4420f43e4d\") " Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.658355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-bundle" (OuterVolumeSpecName: "bundle") pod "af17cbc8-f48a-499e-af36-8f4420f43e4d" (UID: "af17cbc8-f48a-499e-af36-8f4420f43e4d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.665924 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af17cbc8-f48a-499e-af36-8f4420f43e4d-kube-api-access-94csf" (OuterVolumeSpecName: "kube-api-access-94csf") pod "af17cbc8-f48a-499e-af36-8f4420f43e4d" (UID: "af17cbc8-f48a-499e-af36-8f4420f43e4d"). InnerVolumeSpecName "kube-api-access-94csf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.671169 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-util" (OuterVolumeSpecName: "util") pod "af17cbc8-f48a-499e-af36-8f4420f43e4d" (UID: "af17cbc8-f48a-499e-af36-8f4420f43e4d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.757382 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.757441 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94csf\" (UniqueName: \"kubernetes.io/projected/af17cbc8-f48a-499e-af36-8f4420f43e4d-kube-api-access-94csf\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.757456 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af17cbc8-f48a-499e-af36-8f4420f43e4d-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:09 crc kubenswrapper[4921]: I0318 12:25:09.777219 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wnvx"] Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.282217 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.282200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz" event={"ID":"af17cbc8-f48a-499e-af36-8f4420f43e4d","Type":"ContainerDied","Data":"6fc44232d4842ab3b111a902fe4ee1ac02481ad992f79aea0256ce2935d210e7"} Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.282276 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc44232d4842ab3b111a902fe4ee1ac02481ad992f79aea0256ce2935d210e7" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.282339 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9wnvx" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="registry-server" containerID="cri-o://face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8" gracePeriod=2 Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.654843 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.769815 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-utilities\") pod \"59b3de21-b428-43f6-9783-517fda99e679\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.769938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtclf\" (UniqueName: \"kubernetes.io/projected/59b3de21-b428-43f6-9783-517fda99e679-kube-api-access-vtclf\") pod \"59b3de21-b428-43f6-9783-517fda99e679\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.770002 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-catalog-content\") pod \"59b3de21-b428-43f6-9783-517fda99e679\" (UID: \"59b3de21-b428-43f6-9783-517fda99e679\") " Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.770981 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-utilities" (OuterVolumeSpecName: "utilities") pod "59b3de21-b428-43f6-9783-517fda99e679" (UID: "59b3de21-b428-43f6-9783-517fda99e679"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.774371 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b3de21-b428-43f6-9783-517fda99e679-kube-api-access-vtclf" (OuterVolumeSpecName: "kube-api-access-vtclf") pod "59b3de21-b428-43f6-9783-517fda99e679" (UID: "59b3de21-b428-43f6-9783-517fda99e679"). InnerVolumeSpecName "kube-api-access-vtclf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.872170 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.872216 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtclf\" (UniqueName: \"kubernetes.io/projected/59b3de21-b428-43f6-9783-517fda99e679-kube-api-access-vtclf\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.902904 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59b3de21-b428-43f6-9783-517fda99e679" (UID: "59b3de21-b428-43f6-9783-517fda99e679"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:25:10 crc kubenswrapper[4921]: I0318 12:25:10.973050 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b3de21-b428-43f6-9783-517fda99e679-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.292438 4921 generic.go:334] "Generic (PLEG): container finished" podID="59b3de21-b428-43f6-9783-517fda99e679" containerID="face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8" exitCode=0 Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.292505 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerDied","Data":"face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8"} Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.292539 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9wnvx" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.292571 4921 scope.go:117] "RemoveContainer" containerID="face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.292553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9wnvx" event={"ID":"59b3de21-b428-43f6-9783-517fda99e679","Type":"ContainerDied","Data":"98ca1746308388ed1b947706e24afe8492ccbf14341af31b6ee2d2a2b8fbfc3f"} Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.316300 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9wnvx"] Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.319880 4921 scope.go:117] "RemoveContainer" containerID="8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.321933 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9wnvx"] Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.341933 4921 scope.go:117] "RemoveContainer" containerID="6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.372761 4921 scope.go:117] "RemoveContainer" containerID="face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8" Mar 18 12:25:11 crc kubenswrapper[4921]: E0318 12:25:11.373313 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8\": container with ID starting with face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8 not found: ID does not exist" containerID="face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.373361 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8"} err="failed to get container status \"face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8\": rpc error: code = NotFound desc = could not find container \"face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8\": container with ID starting with face95a1162e6917ffff2d26e9c4236169bb9cb2430cb802f85e72c911164de8 not found: ID does not exist" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.373393 4921 scope.go:117] "RemoveContainer" containerID="8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b" Mar 18 12:25:11 crc kubenswrapper[4921]: E0318 12:25:11.373818 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b\": container with ID starting with 8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b not found: ID does not exist" containerID="8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.373845 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b"} err="failed to get container status \"8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b\": rpc error: code = NotFound desc = could not find container \"8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b\": container with ID starting with 8d4c44ff03ca5ed68afa92f4e74a9a2da92c3d3c218892835250ffa42296451b not found: ID does not exist" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.373863 4921 scope.go:117] "RemoveContainer" containerID="6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7" Mar 18 12:25:11 crc kubenswrapper[4921]: E0318 
12:25:11.374239 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7\": container with ID starting with 6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7 not found: ID does not exist" containerID="6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7" Mar 18 12:25:11 crc kubenswrapper[4921]: I0318 12:25:11.374283 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7"} err="failed to get container status \"6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7\": rpc error: code = NotFound desc = could not find container \"6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7\": container with ID starting with 6a12e382d682bf0d6faedc62951ccda210307dc5e6603e1f341f02b30254fdd7 not found: ID does not exist" Mar 18 12:25:13 crc kubenswrapper[4921]: I0318 12:25:13.217866 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b3de21-b428-43f6-9783-517fda99e679" path="/var/lib/kubelet/pods/59b3de21-b428-43f6-9783-517fda99e679/volumes" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545268 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hw58p"] Mar 18 12:25:14 crc kubenswrapper[4921]: E0318 12:25:14.545589 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerName="util" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545604 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerName="util" Mar 18 12:25:14 crc kubenswrapper[4921]: E0318 12:25:14.545621 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" 
containerName="extract" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545627 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerName="extract" Mar 18 12:25:14 crc kubenswrapper[4921]: E0318 12:25:14.545639 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="extract-utilities" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545649 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="extract-utilities" Mar 18 12:25:14 crc kubenswrapper[4921]: E0318 12:25:14.545661 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="extract-content" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545667 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="extract-content" Mar 18 12:25:14 crc kubenswrapper[4921]: E0318 12:25:14.545762 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="registry-server" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545769 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="registry-server" Mar 18 12:25:14 crc kubenswrapper[4921]: E0318 12:25:14.545785 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerName="pull" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545792 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerName="pull" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.545915 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="af17cbc8-f48a-499e-af36-8f4420f43e4d" containerName="extract" Mar 18 12:25:14 crc 
kubenswrapper[4921]: I0318 12:25:14.545932 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b3de21-b428-43f6-9783-517fda99e679" containerName="registry-server" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.546486 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.548220 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dslt8" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.550266 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.557886 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.560059 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hw58p"] Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.623306 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbcxx\" (UniqueName: \"kubernetes.io/projected/c961752a-a620-4a3b-bbb3-10da43ae4a59-kube-api-access-kbcxx\") pod \"nmstate-operator-796d4cfff4-hw58p\" (UID: \"c961752a-a620-4a3b-bbb3-10da43ae4a59\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.724297 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbcxx\" (UniqueName: \"kubernetes.io/projected/c961752a-a620-4a3b-bbb3-10da43ae4a59-kube-api-access-kbcxx\") pod \"nmstate-operator-796d4cfff4-hw58p\" (UID: \"c961752a-a620-4a3b-bbb3-10da43ae4a59\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" Mar 18 12:25:14 crc kubenswrapper[4921]: 
I0318 12:25:14.741100 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbcxx\" (UniqueName: \"kubernetes.io/projected/c961752a-a620-4a3b-bbb3-10da43ae4a59-kube-api-access-kbcxx\") pod \"nmstate-operator-796d4cfff4-hw58p\" (UID: \"c961752a-a620-4a3b-bbb3-10da43ae4a59\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" Mar 18 12:25:14 crc kubenswrapper[4921]: I0318 12:25:14.899979 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" Mar 18 12:25:15 crc kubenswrapper[4921]: I0318 12:25:15.124295 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hw58p"] Mar 18 12:25:15 crc kubenswrapper[4921]: I0318 12:25:15.318461 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" event={"ID":"c961752a-a620-4a3b-bbb3-10da43ae4a59","Type":"ContainerStarted","Data":"e45a22633da54f716c77e8d3ee04a5c12ddf17322e4e8d88e30643a1f63b1182"} Mar 18 12:25:18 crc kubenswrapper[4921]: I0318 12:25:18.338092 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" event={"ID":"c961752a-a620-4a3b-bbb3-10da43ae4a59","Type":"ContainerStarted","Data":"1438d65386de961c5ec78c708a38736b7959aae393ab98e13328424ef671ef4e"} Mar 18 12:25:18 crc kubenswrapper[4921]: I0318 12:25:18.360834 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hw58p" podStartSLOduration=2.094038202 podStartE2EDuration="4.360805916s" podCreationTimestamp="2026-03-18 12:25:14 +0000 UTC" firstStartedPulling="2026-03-18 12:25:15.13067804 +0000 UTC m=+934.680598689" lastFinishedPulling="2026-03-18 12:25:17.397445764 +0000 UTC m=+936.947366403" observedRunningTime="2026-03-18 12:25:18.359796257 +0000 UTC m=+937.909716966" watchObservedRunningTime="2026-03-18 
12:25:18.360805916 +0000 UTC m=+937.910726585" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.014203 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.015385 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.018255 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vmn2h" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.025049 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.058104 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.058975 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.063129 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.070181 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qvc7c"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.071130 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.123392 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.138155 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/11630f15-be33-4a14-9100-4d20eace4502-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.138202 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nml72\" (UniqueName: \"kubernetes.io/projected/3d40f204-447a-4c6d-b289-8b9d21583b02-kube-api-access-nml72\") pod \"nmstate-metrics-9b8c8685d-pfrql\" (UID: \"3d40f204-447a-4c6d-b289-8b9d21583b02\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.138234 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmnpf\" (UniqueName: \"kubernetes.io/projected/11630f15-be33-4a14-9100-4d20eace4502-kube-api-access-pmnpf\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.172961 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.173654 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.175484 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5z6wr" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.178129 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.178159 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.221989 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.239803 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/11630f15-be33-4a14-9100-4d20eace4502-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.239866 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nml72\" (UniqueName: \"kubernetes.io/projected/3d40f204-447a-4c6d-b289-8b9d21583b02-kube-api-access-nml72\") pod \"nmstate-metrics-9b8c8685d-pfrql\" (UID: \"3d40f204-447a-4c6d-b289-8b9d21583b02\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.239895 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-dbus-socket\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 
18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.239934 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmnpf\" (UniqueName: \"kubernetes.io/projected/11630f15-be33-4a14-9100-4d20eace4502-kube-api-access-pmnpf\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.239967 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwjn\" (UniqueName: \"kubernetes.io/projected/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-kube-api-access-rbwjn\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: E0318 12:25:23.239972 4921 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.239999 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-nmstate-lock\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: E0318 12:25:23.240039 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11630f15-be33-4a14-9100-4d20eace4502-tls-key-pair podName:11630f15-be33-4a14-9100-4d20eace4502 nodeName:}" failed. No retries permitted until 2026-03-18 12:25:23.740022323 +0000 UTC m=+943.289942962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/11630f15-be33-4a14-9100-4d20eace4502-tls-key-pair") pod "nmstate-webhook-5f558f5558-w8sm5" (UID: "11630f15-be33-4a14-9100-4d20eace4502") : secret "openshift-nmstate-webhook" not found Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.240064 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-ovs-socket\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.259708 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmnpf\" (UniqueName: \"kubernetes.io/projected/11630f15-be33-4a14-9100-4d20eace4502-kube-api-access-pmnpf\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.262939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nml72\" (UniqueName: \"kubernetes.io/projected/3d40f204-447a-4c6d-b289-8b9d21583b02-kube-api-access-nml72\") pod \"nmstate-metrics-9b8c8685d-pfrql\" (UID: \"3d40f204-447a-4c6d-b289-8b9d21583b02\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.338442 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.344993 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-dbus-socket\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdscb\" (UniqueName: \"kubernetes.io/projected/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-kube-api-access-gdscb\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345085 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345128 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwjn\" (UniqueName: \"kubernetes.io/projected/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-kube-api-access-rbwjn\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345165 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-nmstate-lock\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345225 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-ovs-socket\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345313 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-ovs-socket\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345367 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-nmstate-lock\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.345717 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-dbus-socket\") pod 
\"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.369862 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwjn\" (UniqueName: \"kubernetes.io/projected/a59a7883-d2d3-4f0f-bef7-afc18a6ab54e-kube-api-access-rbwjn\") pod \"nmstate-handler-qvc7c\" (UID: \"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e\") " pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.387626 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.410845 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ffbdfddfb-sn8sf"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.411460 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.424935 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffbdfddfb-sn8sf"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.445951 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.446050 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdscb\" (UniqueName: \"kubernetes.io/projected/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-kube-api-access-gdscb\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.446088 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.447028 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.449930 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.465187 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdscb\" (UniqueName: \"kubernetes.io/projected/e5a4eba4-f1c1-41ee-ac96-55385b0b77b4-kube-api-access-gdscb\") pod \"nmstate-console-plugin-86f58fcf4-l5tc4\" (UID: \"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.488379 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547219 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-trusted-ca-bundle\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547278 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8mx6\" (UniqueName: \"kubernetes.io/projected/db357e3a-b998-4441-8624-38baccd8b6bf-kube-api-access-w8mx6\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547310 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-console-config\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547352 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-oauth-serving-cert\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547465 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/db357e3a-b998-4441-8624-38baccd8b6bf-console-serving-cert\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547575 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db357e3a-b998-4441-8624-38baccd8b6bf-console-oauth-config\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.547640 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-service-ca\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.597948 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql"] Mar 18 12:25:23 crc kubenswrapper[4921]: W0318 12:25:23.603928 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d40f204_447a_4c6d_b289_8b9d21583b02.slice/crio-c2179a992e37c00e5e45d2ac77f36bbde87af03d82101f84cd824be65b96af3d WatchSource:0}: Error finding container c2179a992e37c00e5e45d2ac77f36bbde87af03d82101f84cd824be65b96af3d: Status 404 returned error can't find the container with id c2179a992e37c00e5e45d2ac77f36bbde87af03d82101f84cd824be65b96af3d Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.648826 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/db357e3a-b998-4441-8624-38baccd8b6bf-console-oauth-config\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.648882 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-service-ca\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.648919 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-trusted-ca-bundle\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.648940 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8mx6\" (UniqueName: \"kubernetes.io/projected/db357e3a-b998-4441-8624-38baccd8b6bf-kube-api-access-w8mx6\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.648963 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-console-config\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.648987 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-oauth-serving-cert\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.649015 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db357e3a-b998-4441-8624-38baccd8b6bf-console-serving-cert\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.650289 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-service-ca\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.651865 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-oauth-serving-cert\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.652860 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-trusted-ca-bundle\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.653166 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/db357e3a-b998-4441-8624-38baccd8b6bf-console-config\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.654780 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db357e3a-b998-4441-8624-38baccd8b6bf-console-serving-cert\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.655275 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db357e3a-b998-4441-8624-38baccd8b6bf-console-oauth-config\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.665876 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8mx6\" (UniqueName: \"kubernetes.io/projected/db357e3a-b998-4441-8624-38baccd8b6bf-kube-api-access-w8mx6\") pod \"console-6ffbdfddfb-sn8sf\" (UID: \"db357e3a-b998-4441-8624-38baccd8b6bf\") " pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.713451 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4"] Mar 18 12:25:23 crc kubenswrapper[4921]: W0318 12:25:23.720180 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a4eba4_f1c1_41ee_ac96_55385b0b77b4.slice/crio-fe3b4638f05d836fd812a7be968bb46b04b15b38b60419c5465a7e6d3de6e32d WatchSource:0}: Error finding container fe3b4638f05d836fd812a7be968bb46b04b15b38b60419c5465a7e6d3de6e32d: Status 404 
returned error can't find the container with id fe3b4638f05d836fd812a7be968bb46b04b15b38b60419c5465a7e6d3de6e32d Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.750644 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.751725 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/11630f15-be33-4a14-9100-4d20eace4502-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.755146 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/11630f15-be33-4a14-9100-4d20eace4502-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sm5\" (UID: \"11630f15-be33-4a14-9100-4d20eace4502\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.976009 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffbdfddfb-sn8sf"] Mar 18 12:25:23 crc kubenswrapper[4921]: I0318 12:25:23.976193 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:23 crc kubenswrapper[4921]: W0318 12:25:23.979612 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb357e3a_b998_4441_8624_38baccd8b6bf.slice/crio-4838e7add0d80d81122c180a4eccf43f8c9255612dabcb0aeec9e51c54f6201b WatchSource:0}: Error finding container 4838e7add0d80d81122c180a4eccf43f8c9255612dabcb0aeec9e51c54f6201b: Status 404 returned error can't find the container with id 4838e7add0d80d81122c180a4eccf43f8c9255612dabcb0aeec9e51c54f6201b Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.164100 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5"] Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.377010 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" event={"ID":"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4","Type":"ContainerStarted","Data":"fe3b4638f05d836fd812a7be968bb46b04b15b38b60419c5465a7e6d3de6e32d"} Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.378279 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" event={"ID":"3d40f204-447a-4c6d-b289-8b9d21583b02","Type":"ContainerStarted","Data":"c2179a992e37c00e5e45d2ac77f36bbde87af03d82101f84cd824be65b96af3d"} Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.379635 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffbdfddfb-sn8sf" event={"ID":"db357e3a-b998-4441-8624-38baccd8b6bf","Type":"ContainerStarted","Data":"3c082b044e4da69e0ee0e1691096e89e5f845cad9b5b71a8790bbdb26530bbfd"} Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.379659 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffbdfddfb-sn8sf" 
event={"ID":"db357e3a-b998-4441-8624-38baccd8b6bf","Type":"ContainerStarted","Data":"4838e7add0d80d81122c180a4eccf43f8c9255612dabcb0aeec9e51c54f6201b"} Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.380832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" event={"ID":"11630f15-be33-4a14-9100-4d20eace4502","Type":"ContainerStarted","Data":"0e8a3c69aa61d46cee7a8c1a65055159302256b5823b1f8d1dfac15c6a5026f2"} Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.382231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qvc7c" event={"ID":"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e","Type":"ContainerStarted","Data":"cc67deedf898136a7151a706be74b2eadf8f6be9725f15f21a0b529eb0868179"} Mar 18 12:25:24 crc kubenswrapper[4921]: I0318 12:25:24.409309 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ffbdfddfb-sn8sf" podStartSLOduration=1.409293827 podStartE2EDuration="1.409293827s" podCreationTimestamp="2026-03-18 12:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:25:24.405476629 +0000 UTC m=+943.955397268" watchObservedRunningTime="2026-03-18 12:25:24.409293827 +0000 UTC m=+943.959214466" Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.401255 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" event={"ID":"3d40f204-447a-4c6d-b289-8b9d21583b02","Type":"ContainerStarted","Data":"da5fbb49edcbd7222c6d90fcd412204a9872c44bddbed1c867bb6e207b8029af"} Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.403404 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" 
event={"ID":"11630f15-be33-4a14-9100-4d20eace4502","Type":"ContainerStarted","Data":"3a1339eaf8d70c1bad69f9655906331176103ab6a2eb7aad54d302651cac0946"} Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.404101 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.404688 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qvc7c" event={"ID":"a59a7883-d2d3-4f0f-bef7-afc18a6ab54e","Type":"ContainerStarted","Data":"630dd4048daa373e10c97007898de71625dba555851abb23a1e756b73141cc81"} Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.404807 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.405647 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" event={"ID":"e5a4eba4-f1c1-41ee-ac96-55385b0b77b4","Type":"ContainerStarted","Data":"508701e96c205d79194fbc77e278ff04564c6bb84a0c59f3176c6c54854f60b5"} Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.421492 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" podStartSLOduration=2.275982241 podStartE2EDuration="4.421475011s" podCreationTimestamp="2026-03-18 12:25:23 +0000 UTC" firstStartedPulling="2026-03-18 12:25:24.17284534 +0000 UTC m=+943.722765979" lastFinishedPulling="2026-03-18 12:25:26.31833811 +0000 UTC m=+945.868258749" observedRunningTime="2026-03-18 12:25:27.417339104 +0000 UTC m=+946.967259763" watchObservedRunningTime="2026-03-18 12:25:27.421475011 +0000 UTC m=+946.971395650" Mar 18 12:25:27 crc kubenswrapper[4921]: I0318 12:25:27.441468 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qvc7c" 
podStartSLOduration=1.551777751 podStartE2EDuration="4.441446946s" podCreationTimestamp="2026-03-18 12:25:23 +0000 UTC" firstStartedPulling="2026-03-18 12:25:23.427881053 +0000 UTC m=+942.977801692" lastFinishedPulling="2026-03-18 12:25:26.317550248 +0000 UTC m=+945.867470887" observedRunningTime="2026-03-18 12:25:27.434770747 +0000 UTC m=+946.984691396" watchObservedRunningTime="2026-03-18 12:25:27.441446946 +0000 UTC m=+946.991367615" Mar 18 12:25:29 crc kubenswrapper[4921]: I0318 12:25:29.418078 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" event={"ID":"3d40f204-447a-4c6d-b289-8b9d21583b02","Type":"ContainerStarted","Data":"ac39e40d864babf3a2ea6859152271583f6d36ec9ba7939f43cf2ad73b9ada74"} Mar 18 12:25:29 crc kubenswrapper[4921]: I0318 12:25:29.434766 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pfrql" podStartSLOduration=1.117131773 podStartE2EDuration="6.434751066s" podCreationTimestamp="2026-03-18 12:25:23 +0000 UTC" firstStartedPulling="2026-03-18 12:25:23.609821236 +0000 UTC m=+943.159741875" lastFinishedPulling="2026-03-18 12:25:28.927440529 +0000 UTC m=+948.477361168" observedRunningTime="2026-03-18 12:25:29.432751769 +0000 UTC m=+948.982672448" watchObservedRunningTime="2026-03-18 12:25:29.434751066 +0000 UTC m=+948.984671705" Mar 18 12:25:29 crc kubenswrapper[4921]: I0318 12:25:29.435283 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-l5tc4" podStartSLOduration=3.839110658 podStartE2EDuration="6.435278101s" podCreationTimestamp="2026-03-18 12:25:23 +0000 UTC" firstStartedPulling="2026-03-18 12:25:23.722449215 +0000 UTC m=+943.272369854" lastFinishedPulling="2026-03-18 12:25:26.318616658 +0000 UTC m=+945.868537297" observedRunningTime="2026-03-18 12:25:27.455381341 +0000 UTC m=+947.005301980" watchObservedRunningTime="2026-03-18 
12:25:29.435278101 +0000 UTC m=+948.985198740" Mar 18 12:25:33 crc kubenswrapper[4921]: I0318 12:25:33.416597 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qvc7c" Mar 18 12:25:33 crc kubenswrapper[4921]: I0318 12:25:33.751353 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:33 crc kubenswrapper[4921]: I0318 12:25:33.751401 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:33 crc kubenswrapper[4921]: I0318 12:25:33.759856 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:34 crc kubenswrapper[4921]: I0318 12:25:34.452899 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ffbdfddfb-sn8sf" Mar 18 12:25:34 crc kubenswrapper[4921]: I0318 12:25:34.502459 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8s2wr"] Mar 18 12:25:43 crc kubenswrapper[4921]: I0318 12:25:43.985287 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sm5" Mar 18 12:25:47 crc kubenswrapper[4921]: I0318 12:25:47.081004 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:25:47 crc kubenswrapper[4921]: I0318 12:25:47.081061 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.400843 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns"] Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.402725 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.406128 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.412773 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns"] Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.497444 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.497492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.497554 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnrgh\" (UniqueName: \"kubernetes.io/projected/0a4d9566-78f0-4989-be26-efb5416bcbac-kube-api-access-dnrgh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.598791 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.599282 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.599336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnrgh\" (UniqueName: \"kubernetes.io/projected/0a4d9566-78f0-4989-be26-efb5416bcbac-kube-api-access-dnrgh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.599994 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.600214 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.621464 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnrgh\" (UniqueName: \"kubernetes.io/projected/0a4d9566-78f0-4989-be26-efb5416bcbac-kube-api-access-dnrgh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.720609 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:25:57 crc kubenswrapper[4921]: I0318 12:25:57.911917 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns"] Mar 18 12:25:58 crc kubenswrapper[4921]: I0318 12:25:58.600973 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerID="40670e4b12905088d72ef469d0cb4128696a8dd4c62b29f755320ec7819a8774" exitCode=0 Mar 18 12:25:58 crc kubenswrapper[4921]: I0318 12:25:58.601162 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" event={"ID":"0a4d9566-78f0-4989-be26-efb5416bcbac","Type":"ContainerDied","Data":"40670e4b12905088d72ef469d0cb4128696a8dd4c62b29f755320ec7819a8774"} Mar 18 12:25:58 crc kubenswrapper[4921]: I0318 12:25:58.601534 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" event={"ID":"0a4d9566-78f0-4989-be26-efb5416bcbac","Type":"ContainerStarted","Data":"3c225255043c1542c66c792fcf6fcb99f8fab94a281d81aa29ced6ef2eb2c784"} Mar 18 12:25:59 crc kubenswrapper[4921]: I0318 12:25:59.546127 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8s2wr" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerName="console" containerID="cri-o://1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d" gracePeriod=15 Mar 18 12:25:59 crc kubenswrapper[4921]: I0318 12:25:59.925236 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8s2wr_c039ba3b-7b2d-4fdf-bebe-cde4906a71a2/console/0.log" Mar 18 12:25:59 crc kubenswrapper[4921]: I0318 12:25:59.925323 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.033796 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5k6g\" (UniqueName: \"kubernetes.io/projected/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-kube-api-access-h5k6g\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.033897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-oauth-config\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.033963 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-config\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.034035 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-serving-cert\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.034074 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-service-ca\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.034100 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-trusted-ca-bundle\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.034147 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-oauth-serving-cert\") pod \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\" (UID: \"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2\") " Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.035202 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.035239 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.035284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-config" (OuterVolumeSpecName: "console-config") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.035771 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.039946 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.040511 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.046619 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-kube-api-access-h5k6g" (OuterVolumeSpecName: "kube-api-access-h5k6g") pod "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" (UID: "c039ba3b-7b2d-4fdf-bebe-cde4906a71a2"). InnerVolumeSpecName "kube-api-access-h5k6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.135961 4921 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.136010 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.136019 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.136028 4921 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.136036 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5k6g\" (UniqueName: \"kubernetes.io/projected/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-kube-api-access-h5k6g\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.136046 4921 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.136053 4921 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:00 crc 
kubenswrapper[4921]: I0318 12:26:00.173198 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563946-nw2kt"] Mar 18 12:26:00 crc kubenswrapper[4921]: E0318 12:26:00.173432 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerName="console" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.173447 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerName="console" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.173557 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerName="console" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.173896 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.176382 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.176652 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.176812 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.183543 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-nw2kt"] Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.236765 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8vj\" (UniqueName: \"kubernetes.io/projected/4029d08d-77da-4e73-a89d-134ed55ea00b-kube-api-access-nh8vj\") pod \"auto-csr-approver-29563946-nw2kt\" (UID: 
\"4029d08d-77da-4e73-a89d-134ed55ea00b\") " pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.337684 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8vj\" (UniqueName: \"kubernetes.io/projected/4029d08d-77da-4e73-a89d-134ed55ea00b-kube-api-access-nh8vj\") pod \"auto-csr-approver-29563946-nw2kt\" (UID: \"4029d08d-77da-4e73-a89d-134ed55ea00b\") " pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.358212 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8vj\" (UniqueName: \"kubernetes.io/projected/4029d08d-77da-4e73-a89d-134ed55ea00b-kube-api-access-nh8vj\") pod \"auto-csr-approver-29563946-nw2kt\" (UID: \"4029d08d-77da-4e73-a89d-134ed55ea00b\") " pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.491438 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.616894 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8s2wr_c039ba3b-7b2d-4fdf-bebe-cde4906a71a2/console/0.log" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.616958 4921 generic.go:334] "Generic (PLEG): container finished" podID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" containerID="1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d" exitCode=2 Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.616994 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8s2wr" event={"ID":"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2","Type":"ContainerDied","Data":"1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d"} Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.617029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8s2wr" event={"ID":"c039ba3b-7b2d-4fdf-bebe-cde4906a71a2","Type":"ContainerDied","Data":"dc30c8e9ae50031f83fd3b36a8c20907fd3cbf2362e2de4d20e9116dbb43062d"} Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.617051 4921 scope.go:117] "RemoveContainer" containerID="1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.617102 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8s2wr" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.638448 4921 scope.go:117] "RemoveContainer" containerID="1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d" Mar 18 12:26:00 crc kubenswrapper[4921]: E0318 12:26:00.640681 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d\": container with ID starting with 1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d not found: ID does not exist" containerID="1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.640718 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d"} err="failed to get container status \"1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d\": rpc error: code = NotFound desc = could not find container \"1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d\": container with ID starting with 1df5c7a068df5f1b34000d6a51061e2faa8d106737ca018e412b5866994b8e7d not found: ID does not exist" Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.657987 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8s2wr"] Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.658055 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8s2wr"] Mar 18 12:26:00 crc kubenswrapper[4921]: I0318 12:26:00.687858 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-nw2kt"] Mar 18 12:26:00 crc kubenswrapper[4921]: W0318 12:26:00.694448 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4029d08d_77da_4e73_a89d_134ed55ea00b.slice/crio-bbd05640f8a25f9a828f4328f612a83d53fdf12b1169eb22672be12a60e226a4 WatchSource:0}: Error finding container bbd05640f8a25f9a828f4328f612a83d53fdf12b1169eb22672be12a60e226a4: Status 404 returned error can't find the container with id bbd05640f8a25f9a828f4328f612a83d53fdf12b1169eb22672be12a60e226a4 Mar 18 12:26:01 crc kubenswrapper[4921]: I0318 12:26:01.225330 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c039ba3b-7b2d-4fdf-bebe-cde4906a71a2" path="/var/lib/kubelet/pods/c039ba3b-7b2d-4fdf-bebe-cde4906a71a2/volumes" Mar 18 12:26:01 crc kubenswrapper[4921]: I0318 12:26:01.624202 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerID="d2b8c113884ea86a1aa911f87a124b02f575d895dbb713a350cd8f62ded28076" exitCode=0 Mar 18 12:26:01 crc kubenswrapper[4921]: I0318 12:26:01.624293 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" event={"ID":"0a4d9566-78f0-4989-be26-efb5416bcbac","Type":"ContainerDied","Data":"d2b8c113884ea86a1aa911f87a124b02f575d895dbb713a350cd8f62ded28076"} Mar 18 12:26:01 crc kubenswrapper[4921]: I0318 12:26:01.625500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" event={"ID":"4029d08d-77da-4e73-a89d-134ed55ea00b","Type":"ContainerStarted","Data":"bbd05640f8a25f9a828f4328f612a83d53fdf12b1169eb22672be12a60e226a4"} Mar 18 12:26:02 crc kubenswrapper[4921]: I0318 12:26:02.637048 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerID="ad47b2bbc1dd9c9cc5bb352ce3db4cec4f7908c9dd05cc61f7a76e6cee800402" exitCode=0 Mar 18 12:26:02 crc kubenswrapper[4921]: I0318 12:26:02.637180 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" event={"ID":"0a4d9566-78f0-4989-be26-efb5416bcbac","Type":"ContainerDied","Data":"ad47b2bbc1dd9c9cc5bb352ce3db4cec4f7908c9dd05cc61f7a76e6cee800402"} Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.643199 4921 generic.go:334] "Generic (PLEG): container finished" podID="4029d08d-77da-4e73-a89d-134ed55ea00b" containerID="ed442339ea68f65084a0d26f50effbf3acc23f10f1455f1be7d0dfb1631dfec6" exitCode=0 Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.643271 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" event={"ID":"4029d08d-77da-4e73-a89d-134ed55ea00b","Type":"ContainerDied","Data":"ed442339ea68f65084a0d26f50effbf3acc23f10f1455f1be7d0dfb1631dfec6"} Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.855430 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.887065 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-bundle\") pod \"0a4d9566-78f0-4989-be26-efb5416bcbac\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.887184 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnrgh\" (UniqueName: \"kubernetes.io/projected/0a4d9566-78f0-4989-be26-efb5416bcbac-kube-api-access-dnrgh\") pod \"0a4d9566-78f0-4989-be26-efb5416bcbac\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.887221 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-util\") pod \"0a4d9566-78f0-4989-be26-efb5416bcbac\" (UID: \"0a4d9566-78f0-4989-be26-efb5416bcbac\") " Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.888319 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-bundle" (OuterVolumeSpecName: "bundle") pod "0a4d9566-78f0-4989-be26-efb5416bcbac" (UID: "0a4d9566-78f0-4989-be26-efb5416bcbac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.893309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4d9566-78f0-4989-be26-efb5416bcbac-kube-api-access-dnrgh" (OuterVolumeSpecName: "kube-api-access-dnrgh") pod "0a4d9566-78f0-4989-be26-efb5416bcbac" (UID: "0a4d9566-78f0-4989-be26-efb5416bcbac"). InnerVolumeSpecName "kube-api-access-dnrgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.899536 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-util" (OuterVolumeSpecName: "util") pod "0a4d9566-78f0-4989-be26-efb5416bcbac" (UID: "0a4d9566-78f0-4989-be26-efb5416bcbac"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.988498 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.988534 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnrgh\" (UniqueName: \"kubernetes.io/projected/0a4d9566-78f0-4989-be26-efb5416bcbac-kube-api-access-dnrgh\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:03 crc kubenswrapper[4921]: I0318 12:26:03.988547 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a4d9566-78f0-4989-be26-efb5416bcbac-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:04 crc kubenswrapper[4921]: I0318 12:26:04.653355 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" event={"ID":"0a4d9566-78f0-4989-be26-efb5416bcbac","Type":"ContainerDied","Data":"3c225255043c1542c66c792fcf6fcb99f8fab94a281d81aa29ced6ef2eb2c784"} Mar 18 12:26:04 crc kubenswrapper[4921]: I0318 12:26:04.653410 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c225255043c1542c66c792fcf6fcb99f8fab94a281d81aa29ced6ef2eb2c784" Mar 18 12:26:04 crc kubenswrapper[4921]: I0318 12:26:04.653409 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns" Mar 18 12:26:04 crc kubenswrapper[4921]: I0318 12:26:04.865623 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:04 crc kubenswrapper[4921]: I0318 12:26:04.901833 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8vj\" (UniqueName: \"kubernetes.io/projected/4029d08d-77da-4e73-a89d-134ed55ea00b-kube-api-access-nh8vj\") pod \"4029d08d-77da-4e73-a89d-134ed55ea00b\" (UID: \"4029d08d-77da-4e73-a89d-134ed55ea00b\") " Mar 18 12:26:04 crc kubenswrapper[4921]: I0318 12:26:04.905234 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4029d08d-77da-4e73-a89d-134ed55ea00b-kube-api-access-nh8vj" (OuterVolumeSpecName: "kube-api-access-nh8vj") pod "4029d08d-77da-4e73-a89d-134ed55ea00b" (UID: "4029d08d-77da-4e73-a89d-134ed55ea00b"). InnerVolumeSpecName "kube-api-access-nh8vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:05 crc kubenswrapper[4921]: I0318 12:26:05.003452 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8vj\" (UniqueName: \"kubernetes.io/projected/4029d08d-77da-4e73-a89d-134ed55ea00b-kube-api-access-nh8vj\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:05 crc kubenswrapper[4921]: I0318 12:26:05.660429 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" event={"ID":"4029d08d-77da-4e73-a89d-134ed55ea00b","Type":"ContainerDied","Data":"bbd05640f8a25f9a828f4328f612a83d53fdf12b1169eb22672be12a60e226a4"} Mar 18 12:26:05 crc kubenswrapper[4921]: I0318 12:26:05.660475 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbd05640f8a25f9a828f4328f612a83d53fdf12b1169eb22672be12a60e226a4" Mar 18 12:26:05 crc kubenswrapper[4921]: I0318 12:26:05.660489 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-nw2kt" Mar 18 12:26:05 crc kubenswrapper[4921]: I0318 12:26:05.915552 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-l9d7b"] Mar 18 12:26:05 crc kubenswrapper[4921]: I0318 12:26:05.921375 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-l9d7b"] Mar 18 12:26:07 crc kubenswrapper[4921]: I0318 12:26:07.222749 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9191a818-dc78-4b2f-801b-d965eeae5c8b" path="/var/lib/kubelet/pods/9191a818-dc78-4b2f-801b-d965eeae5c8b/volumes" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317043 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq"] Mar 18 12:26:12 crc kubenswrapper[4921]: E0318 12:26:12.317567 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4029d08d-77da-4e73-a89d-134ed55ea00b" containerName="oc" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317580 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4029d08d-77da-4e73-a89d-134ed55ea00b" containerName="oc" Mar 18 12:26:12 crc kubenswrapper[4921]: E0318 12:26:12.317591 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="pull" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317597 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="pull" Mar 18 12:26:12 crc kubenswrapper[4921]: E0318 12:26:12.317608 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="extract" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317615 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="extract" Mar 
18 12:26:12 crc kubenswrapper[4921]: E0318 12:26:12.317627 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="util" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317635 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="util" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317744 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4d9566-78f0-4989-be26-efb5416bcbac" containerName="extract" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.317775 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4029d08d-77da-4e73-a89d-134ed55ea00b" containerName="oc" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.318197 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.323154 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.323338 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.323597 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.324792 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dh8cs" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.324972 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.338528 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq"] Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.391255 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpvq6\" (UniqueName: \"kubernetes.io/projected/67daa5a5-ba64-4de4-95c8-d3217b539a80-kube-api-access-jpvq6\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.391655 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67daa5a5-ba64-4de4-95c8-d3217b539a80-webhook-cert\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.391709 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67daa5a5-ba64-4de4-95c8-d3217b539a80-apiservice-cert\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.492756 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67daa5a5-ba64-4de4-95c8-d3217b539a80-apiservice-cert\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.492808 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67daa5a5-ba64-4de4-95c8-d3217b539a80-webhook-cert\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.492857 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpvq6\" (UniqueName: \"kubernetes.io/projected/67daa5a5-ba64-4de4-95c8-d3217b539a80-kube-api-access-jpvq6\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.504143 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67daa5a5-ba64-4de4-95c8-d3217b539a80-apiservice-cert\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.516894 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67daa5a5-ba64-4de4-95c8-d3217b539a80-webhook-cert\") pod \"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.521305 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpvq6\" (UniqueName: \"kubernetes.io/projected/67daa5a5-ba64-4de4-95c8-d3217b539a80-kube-api-access-jpvq6\") pod 
\"metallb-operator-controller-manager-7b768b9d77-lrsfq\" (UID: \"67daa5a5-ba64-4de4-95c8-d3217b539a80\") " pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.620749 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4"] Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.621618 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.627833 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.627861 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.627901 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2wgfh" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.633068 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.699879 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfll\" (UniqueName: \"kubernetes.io/projected/2af7d7fb-43a0-4481-ad60-17d1448df801-kube-api-access-4kfll\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.700351 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2af7d7fb-43a0-4481-ad60-17d1448df801-apiservice-cert\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.701368 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2af7d7fb-43a0-4481-ad60-17d1448df801-webhook-cert\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.712616 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4"] Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.802390 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfll\" (UniqueName: \"kubernetes.io/projected/2af7d7fb-43a0-4481-ad60-17d1448df801-kube-api-access-4kfll\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: 
\"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.802465 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2af7d7fb-43a0-4481-ad60-17d1448df801-apiservice-cert\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.802509 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2af7d7fb-43a0-4481-ad60-17d1448df801-webhook-cert\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.808610 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2af7d7fb-43a0-4481-ad60-17d1448df801-apiservice-cert\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.820937 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2af7d7fb-43a0-4481-ad60-17d1448df801-webhook-cert\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.822700 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfll\" (UniqueName: 
\"kubernetes.io/projected/2af7d7fb-43a0-4481-ad60-17d1448df801-kube-api-access-4kfll\") pod \"metallb-operator-webhook-server-584c55bff8-7t5c4\" (UID: \"2af7d7fb-43a0-4481-ad60-17d1448df801\") " pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.951778 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:12 crc kubenswrapper[4921]: I0318 12:26:12.958629 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq"] Mar 18 12:26:13 crc kubenswrapper[4921]: I0318 12:26:13.186215 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4"] Mar 18 12:26:13 crc kubenswrapper[4921]: W0318 12:26:13.195020 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af7d7fb_43a0_4481_ad60_17d1448df801.slice/crio-319ee476157fa447f31252efa5c51e3c9f75f21bd346a6f226be0cdb63d59bbb WatchSource:0}: Error finding container 319ee476157fa447f31252efa5c51e3c9f75f21bd346a6f226be0cdb63d59bbb: Status 404 returned error can't find the container with id 319ee476157fa447f31252efa5c51e3c9f75f21bd346a6f226be0cdb63d59bbb Mar 18 12:26:13 crc kubenswrapper[4921]: I0318 12:26:13.707183 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" event={"ID":"67daa5a5-ba64-4de4-95c8-d3217b539a80","Type":"ContainerStarted","Data":"d326055be3cf42ab0d3f260172ef47894515f24dfb3b71f8b9f13801322994a0"} Mar 18 12:26:13 crc kubenswrapper[4921]: I0318 12:26:13.708875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" 
event={"ID":"2af7d7fb-43a0-4481-ad60-17d1448df801","Type":"ContainerStarted","Data":"319ee476157fa447f31252efa5c51e3c9f75f21bd346a6f226be0cdb63d59bbb"} Mar 18 12:26:17 crc kubenswrapper[4921]: I0318 12:26:17.080886 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:26:17 crc kubenswrapper[4921]: I0318 12:26:17.081273 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:19 crc kubenswrapper[4921]: I0318 12:26:19.743444 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" event={"ID":"2af7d7fb-43a0-4481-ad60-17d1448df801","Type":"ContainerStarted","Data":"20a64eb173273f474130a8ae622c1a7e4b309e87c24d550d3deeddd01dcfb4ac"} Mar 18 12:26:19 crc kubenswrapper[4921]: I0318 12:26:19.744139 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:19 crc kubenswrapper[4921]: I0318 12:26:19.745856 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" event={"ID":"67daa5a5-ba64-4de4-95c8-d3217b539a80","Type":"ContainerStarted","Data":"08135d1fb283d84294f25f74584cba6eb1acfb1dd258f4c3be4110fed32552fe"} Mar 18 12:26:19 crc kubenswrapper[4921]: I0318 12:26:19.746064 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 
12:26:19 crc kubenswrapper[4921]: I0318 12:26:19.765730 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" podStartSLOduration=2.240879233 podStartE2EDuration="7.765709117s" podCreationTimestamp="2026-03-18 12:26:12 +0000 UTC" firstStartedPulling="2026-03-18 12:26:13.196518486 +0000 UTC m=+992.746439125" lastFinishedPulling="2026-03-18 12:26:18.72134837 +0000 UTC m=+998.271269009" observedRunningTime="2026-03-18 12:26:19.760326404 +0000 UTC m=+999.310247043" watchObservedRunningTime="2026-03-18 12:26:19.765709117 +0000 UTC m=+999.315629766" Mar 18 12:26:19 crc kubenswrapper[4921]: I0318 12:26:19.785124 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" podStartSLOduration=2.034404446 podStartE2EDuration="7.785082675s" podCreationTimestamp="2026-03-18 12:26:12 +0000 UTC" firstStartedPulling="2026-03-18 12:26:12.961930533 +0000 UTC m=+992.511851172" lastFinishedPulling="2026-03-18 12:26:18.712608762 +0000 UTC m=+998.262529401" observedRunningTime="2026-03-18 12:26:19.778866729 +0000 UTC m=+999.328787398" watchObservedRunningTime="2026-03-18 12:26:19.785082675 +0000 UTC m=+999.335003324" Mar 18 12:26:32 crc kubenswrapper[4921]: I0318 12:26:32.958638 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-584c55bff8-7t5c4" Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.082163 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.082806 4921 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.082857 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.083506 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebcc7bf1aa6f60def18576e51eaa04202bf67a3ba2c684f5b12ee3391d160ae7"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.083574 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://ebcc7bf1aa6f60def18576e51eaa04202bf67a3ba2c684f5b12ee3391d160ae7" gracePeriod=600 Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.925711 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="ebcc7bf1aa6f60def18576e51eaa04202bf67a3ba2c684f5b12ee3391d160ae7" exitCode=0 Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.925792 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"ebcc7bf1aa6f60def18576e51eaa04202bf67a3ba2c684f5b12ee3391d160ae7"} Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.926086 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"b0b15d604734e663af7f5ab441b134e0458c09c7238f9cd112cf51b089408bef"} Mar 18 12:26:47 crc kubenswrapper[4921]: I0318 12:26:47.926154 4921 scope.go:117] "RemoveContainer" containerID="766d7dfa17c3a2f9b917da54f565e4c9feb5034e40822c9f54b662ea59c3b6dc" Mar 18 12:26:48 crc kubenswrapper[4921]: I0318 12:26:48.343705 4921 scope.go:117] "RemoveContainer" containerID="00488fa9a37401bc2a8a43ebf8258eb57a6a8ff740d85bf5dffb3349df358a78" Mar 18 12:26:52 crc kubenswrapper[4921]: I0318 12:26:52.637597 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b768b9d77-lrsfq" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.334829 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-b2vvr"] Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.337413 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.339705 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zp8k5" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.339956 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.340341 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.340869 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr"] Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.341848 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.347612 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.356319 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr"] Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.423785 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7rj4p"] Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.424780 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.427105 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.427246 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.427191 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hrr2d" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.430411 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.457658 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-vqkhh"] Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.457869 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-sockets\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 
12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.457988 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ck8b\" (UniqueName: \"kubernetes.io/projected/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-kube-api-access-4ck8b\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/215630e2-ecef-4272-889c-f4ef039f4eab-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-nbvzr\" (UID: \"215630e2-ecef-4272-889c-f4ef039f4eab\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458158 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-startup\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458235 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-metrics\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458298 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-conf\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458370 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-reloader\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458442 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-metrics-certs\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.458514 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29q42\" (UniqueName: \"kubernetes.io/projected/215630e2-ecef-4272-889c-f4ef039f4eab-kube-api-access-29q42\") pod \"frr-k8s-webhook-server-bcc4b6f68-nbvzr\" (UID: \"215630e2-ecef-4272-889c-f4ef039f4eab\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.460226 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.463962 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.467657 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-vqkhh"] Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.559969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560028 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489sz\" (UniqueName: \"kubernetes.io/projected/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-kube-api-access-489sz\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560060 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-metrics-certs\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560081 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-metrics-certs\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc 
kubenswrapper[4921]: I0318 12:26:53.560169 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-sockets\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560198 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvq4z\" (UniqueName: \"kubernetes.io/projected/9f3127ca-083e-4013-8b41-1981194a7624-kube-api-access-jvq4z\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560219 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-cert\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560246 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ck8b\" (UniqueName: \"kubernetes.io/projected/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-kube-api-access-4ck8b\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560265 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/215630e2-ecef-4272-889c-f4ef039f4eab-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-nbvzr\" (UID: \"215630e2-ecef-4272-889c-f4ef039f4eab\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560288 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-startup\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560308 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-metrics\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560331 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-conf\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560354 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-reloader\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560375 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-metallb-excludel2\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560396 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-metrics-certs\") pod 
\"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.560419 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29q42\" (UniqueName: \"kubernetes.io/projected/215630e2-ecef-4272-889c-f4ef039f4eab-kube-api-access-29q42\") pod \"frr-k8s-webhook-server-bcc4b6f68-nbvzr\" (UID: \"215630e2-ecef-4272-889c-f4ef039f4eab\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.561235 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-sockets\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.562785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-conf\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.562818 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-reloader\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.562945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-metrics\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.563932 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-frr-startup\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.574887 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-metrics-certs\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.575504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/215630e2-ecef-4272-889c-f4ef039f4eab-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-nbvzr\" (UID: \"215630e2-ecef-4272-889c-f4ef039f4eab\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.579093 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29q42\" (UniqueName: \"kubernetes.io/projected/215630e2-ecef-4272-889c-f4ef039f4eab-kube-api-access-29q42\") pod \"frr-k8s-webhook-server-bcc4b6f68-nbvzr\" (UID: \"215630e2-ecef-4272-889c-f4ef039f4eab\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.581424 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ck8b\" (UniqueName: \"kubernetes.io/projected/53ad344e-7f2e-4dae-ab48-0c2eb650c6e6-kube-api-access-4ck8b\") pod \"frr-k8s-b2vvr\" (UID: \"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6\") " pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.657687 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661012 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvq4z\" (UniqueName: \"kubernetes.io/projected/9f3127ca-083e-4013-8b41-1981194a7624-kube-api-access-jvq4z\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661051 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-cert\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661090 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-metallb-excludel2\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661155 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661182 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-489sz\" (UniqueName: \"kubernetes.io/projected/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-kube-api-access-489sz\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661207 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-metrics-certs\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.661230 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-metrics-certs\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: E0318 12:26:53.661376 4921 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 18 12:26:53 crc kubenswrapper[4921]: E0318 12:26:53.661428 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-metrics-certs podName:9f3127ca-083e-4013-8b41-1981194a7624 nodeName:}" failed. No retries permitted until 2026-03-18 12:26:54.161408354 +0000 UTC m=+1033.711328993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-metrics-certs") pod "controller-7bb4cc7c98-vqkhh" (UID: "9f3127ca-083e-4013-8b41-1981194a7624") : secret "controller-certs-secret" not found Mar 18 12:26:53 crc kubenswrapper[4921]: E0318 12:26:53.661472 4921 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 12:26:53 crc kubenswrapper[4921]: E0318 12:26:53.661519 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist podName:d09ed9aa-00f0-4f1b-b259-c2dca6943b9d nodeName:}" failed. 
No retries permitted until 2026-03-18 12:26:54.161505067 +0000 UTC m=+1033.711425706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist") pod "speaker-7rj4p" (UID: "d09ed9aa-00f0-4f1b-b259-c2dca6943b9d") : secret "metallb-memberlist" not found Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.662189 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-metallb-excludel2\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.663176 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.666698 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-metrics-certs\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.669960 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.681881 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-cert\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.691668 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-489sz\" (UniqueName: \"kubernetes.io/projected/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-kube-api-access-489sz\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.693688 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvq4z\" (UniqueName: \"kubernetes.io/projected/9f3127ca-083e-4013-8b41-1981194a7624-kube-api-access-jvq4z\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.918261 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr"] Mar 18 12:26:53 crc kubenswrapper[4921]: W0318 12:26:53.920140 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod215630e2_ecef_4272_889c_f4ef039f4eab.slice/crio-5904adccb5990504db51da6e950328e3768a0939973a852e18a29276d6d0460f WatchSource:0}: Error finding container 5904adccb5990504db51da6e950328e3768a0939973a852e18a29276d6d0460f: Status 404 returned error can't find the container with id 5904adccb5990504db51da6e950328e3768a0939973a852e18a29276d6d0460f Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.967813 
4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" event={"ID":"215630e2-ecef-4272-889c-f4ef039f4eab","Type":"ContainerStarted","Data":"5904adccb5990504db51da6e950328e3768a0939973a852e18a29276d6d0460f"} Mar 18 12:26:53 crc kubenswrapper[4921]: I0318 12:26:53.968578 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"cf23e46117bf1392e38988256a249138ec634220bacd43791ff74f3f284daf86"} Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.168175 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.168228 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-metrics-certs\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:54 crc kubenswrapper[4921]: E0318 12:26:54.168336 4921 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 12:26:54 crc kubenswrapper[4921]: E0318 12:26:54.168402 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist podName:d09ed9aa-00f0-4f1b-b259-c2dca6943b9d nodeName:}" failed. No retries permitted until 2026-03-18 12:26:55.168386071 +0000 UTC m=+1034.718306710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist") pod "speaker-7rj4p" (UID: "d09ed9aa-00f0-4f1b-b259-c2dca6943b9d") : secret "metallb-memberlist" not found Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.174319 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f3127ca-083e-4013-8b41-1981194a7624-metrics-certs\") pod \"controller-7bb4cc7c98-vqkhh\" (UID: \"9f3127ca-083e-4013-8b41-1981194a7624\") " pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.380060 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.777452 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-vqkhh"] Mar 18 12:26:54 crc kubenswrapper[4921]: W0318 12:26:54.783597 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3127ca_083e_4013_8b41_1981194a7624.slice/crio-bcf64d1a76535d1264b5e11503f613862a743a4c9e98460de44e10fcf61ae5b9 WatchSource:0}: Error finding container bcf64d1a76535d1264b5e11503f613862a743a4c9e98460de44e10fcf61ae5b9: Status 404 returned error can't find the container with id bcf64d1a76535d1264b5e11503f613862a743a4c9e98460de44e10fcf61ae5b9 Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.993403 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vqkhh" event={"ID":"9f3127ca-083e-4013-8b41-1981194a7624","Type":"ContainerStarted","Data":"4cfc188e0313b1918c6233344f87f1c7d35b0cb37b2efe3eb1a39d0646f96a52"} Mar 18 12:26:54 crc kubenswrapper[4921]: I0318 12:26:54.993448 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-7bb4cc7c98-vqkhh" event={"ID":"9f3127ca-083e-4013-8b41-1981194a7624","Type":"ContainerStarted","Data":"bcf64d1a76535d1264b5e11503f613862a743a4c9e98460de44e10fcf61ae5b9"} Mar 18 12:26:55 crc kubenswrapper[4921]: I0318 12:26:55.188762 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:55 crc kubenswrapper[4921]: I0318 12:26:55.193578 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d09ed9aa-00f0-4f1b-b259-c2dca6943b9d-memberlist\") pod \"speaker-7rj4p\" (UID: \"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d\") " pod="metallb-system/speaker-7rj4p" Mar 18 12:26:55 crc kubenswrapper[4921]: I0318 12:26:55.243756 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7rj4p" Mar 18 12:26:55 crc kubenswrapper[4921]: W0318 12:26:55.268370 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd09ed9aa_00f0_4f1b_b259_c2dca6943b9d.slice/crio-14544ded94cc95bdc1deaba01335db8cd0146cea2bfa8c0da6007581864cfb75 WatchSource:0}: Error finding container 14544ded94cc95bdc1deaba01335db8cd0146cea2bfa8c0da6007581864cfb75: Status 404 returned error can't find the container with id 14544ded94cc95bdc1deaba01335db8cd0146cea2bfa8c0da6007581864cfb75 Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.001592 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-vqkhh" event={"ID":"9f3127ca-083e-4013-8b41-1981194a7624","Type":"ContainerStarted","Data":"933b1cce23f4e56da7b68bf513f6b135df4a8c062cef6a8c16f12f9f1b21a476"} Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.001947 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.003992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7rj4p" event={"ID":"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d","Type":"ContainerStarted","Data":"8a3ae8f61f2293a7d7b061e51783ff1cf6f54d877ceee83468affdd50204d257"} Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.004030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7rj4p" event={"ID":"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d","Type":"ContainerStarted","Data":"88c98ff3823ccf58aa90008431a43a32c4f3ff260462d1058eecd6d61fa31350"} Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.004047 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7rj4p" 
event={"ID":"d09ed9aa-00f0-4f1b-b259-c2dca6943b9d","Type":"ContainerStarted","Data":"14544ded94cc95bdc1deaba01335db8cd0146cea2bfa8c0da6007581864cfb75"} Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.004241 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7rj4p" Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.020155 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-vqkhh" podStartSLOduration=3.020130843 podStartE2EDuration="3.020130843s" podCreationTimestamp="2026-03-18 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:26:56.015448471 +0000 UTC m=+1035.565369130" watchObservedRunningTime="2026-03-18 12:26:56.020130843 +0000 UTC m=+1035.570051482" Mar 18 12:26:56 crc kubenswrapper[4921]: I0318 12:26:56.036577 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7rj4p" podStartSLOduration=3.036556578 podStartE2EDuration="3.036556578s" podCreationTimestamp="2026-03-18 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:26:56.033573484 +0000 UTC m=+1035.583494123" watchObservedRunningTime="2026-03-18 12:26:56.036556578 +0000 UTC m=+1035.586477217" Mar 18 12:27:02 crc kubenswrapper[4921]: I0318 12:27:02.050909 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" event={"ID":"215630e2-ecef-4272-889c-f4ef039f4eab","Type":"ContainerStarted","Data":"d05a755b0a8382ca7a30ace02d6ff74b8f63f272ef8e964b893ba5791db6a4c6"} Mar 18 12:27:02 crc kubenswrapper[4921]: I0318 12:27:02.051496 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:27:02 crc 
kubenswrapper[4921]: I0318 12:27:02.055037 4921 generic.go:334] "Generic (PLEG): container finished" podID="53ad344e-7f2e-4dae-ab48-0c2eb650c6e6" containerID="e442dd95c04d2322d34475cd90896a90ef196504c124437dd31907ce382c4bd3" exitCode=0 Mar 18 12:27:02 crc kubenswrapper[4921]: I0318 12:27:02.055086 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerDied","Data":"e442dd95c04d2322d34475cd90896a90ef196504c124437dd31907ce382c4bd3"} Mar 18 12:27:02 crc kubenswrapper[4921]: I0318 12:27:02.071567 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" podStartSLOduration=1.62877673 podStartE2EDuration="9.07154694s" podCreationTimestamp="2026-03-18 12:26:53 +0000 UTC" firstStartedPulling="2026-03-18 12:26:53.922853298 +0000 UTC m=+1033.472773937" lastFinishedPulling="2026-03-18 12:27:01.365623508 +0000 UTC m=+1040.915544147" observedRunningTime="2026-03-18 12:27:02.066712543 +0000 UTC m=+1041.616633182" watchObservedRunningTime="2026-03-18 12:27:02.07154694 +0000 UTC m=+1041.621467579" Mar 18 12:27:03 crc kubenswrapper[4921]: I0318 12:27:03.063234 4921 generic.go:334] "Generic (PLEG): container finished" podID="53ad344e-7f2e-4dae-ab48-0c2eb650c6e6" containerID="147c88fef83ae576163a63ca7f9a6ee91f0673d6170cf0cbb16d154ecf04f5a7" exitCode=0 Mar 18 12:27:03 crc kubenswrapper[4921]: I0318 12:27:03.063317 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerDied","Data":"147c88fef83ae576163a63ca7f9a6ee91f0673d6170cf0cbb16d154ecf04f5a7"} Mar 18 12:27:04 crc kubenswrapper[4921]: I0318 12:27:04.078880 4921 generic.go:334] "Generic (PLEG): container finished" podID="53ad344e-7f2e-4dae-ab48-0c2eb650c6e6" containerID="459bcb571e415ffe7c116e5a01045b065397aaec649889c82b44fe623504886a" exitCode=0 Mar 18 
12:27:04 crc kubenswrapper[4921]: I0318 12:27:04.078979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerDied","Data":"459bcb571e415ffe7c116e5a01045b065397aaec649889c82b44fe623504886a"} Mar 18 12:27:04 crc kubenswrapper[4921]: I0318 12:27:04.383767 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-vqkhh" Mar 18 12:27:05 crc kubenswrapper[4921]: I0318 12:27:05.115697 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"7fbfbc57e522d64f93c905d639960077c3e28881902bcebe1ed0bafb307fca94"} Mar 18 12:27:05 crc kubenswrapper[4921]: I0318 12:27:05.115757 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"9393276e50f58c7a6035140bc07c8873505bd0fb54651aef397bf98b18cd34fd"} Mar 18 12:27:05 crc kubenswrapper[4921]: I0318 12:27:05.115777 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"938b0cada458d1f0684b9ac888d0fd79c140fedac9fa7086e8782fc1d2728e71"} Mar 18 12:27:05 crc kubenswrapper[4921]: I0318 12:27:05.115794 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"d0eac9592785dce4846d32a3c28d98d580357087b05e2053ab06c05929136f27"} Mar 18 12:27:05 crc kubenswrapper[4921]: I0318 12:27:05.115812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" 
event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"074fcde49a92d1349d07d87b719cf3e079eef1399efef6f76d83da59076bee8c"} Mar 18 12:27:05 crc kubenswrapper[4921]: I0318 12:27:05.246683 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7rj4p" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.126600 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b2vvr" event={"ID":"53ad344e-7f2e-4dae-ab48-0c2eb650c6e6","Type":"ContainerStarted","Data":"66cbe59f2692581f867935afeb917d056dbcfa90a11279f5540323037631ca8d"} Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.127979 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.153335 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-b2vvr" podStartSLOduration=5.663795573 podStartE2EDuration="13.153315166s" podCreationTimestamp="2026-03-18 12:26:53 +0000 UTC" firstStartedPulling="2026-03-18 12:26:53.861276025 +0000 UTC m=+1033.411196664" lastFinishedPulling="2026-03-18 12:27:01.350795618 +0000 UTC m=+1040.900716257" observedRunningTime="2026-03-18 12:27:06.15204024 +0000 UTC m=+1045.701960889" watchObservedRunningTime="2026-03-18 12:27:06.153315166 +0000 UTC m=+1045.703235805" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.537440 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c"] Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.538768 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.541536 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.547869 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.547955 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.547981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qhc\" (UniqueName: \"kubernetes.io/projected/139d771c-f615-472e-9ac9-ce64e4f543bf-kube-api-access-28qhc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.568323 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c"] Mar 18 12:27:06 crc kubenswrapper[4921]: 
I0318 12:27:06.648740 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.649146 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qhc\" (UniqueName: \"kubernetes.io/projected/139d771c-f615-472e-9ac9-ce64e4f543bf-kube-api-access-28qhc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.649215 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.649525 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.649706 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.672003 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qhc\" (UniqueName: \"kubernetes.io/projected/139d771c-f615-472e-9ac9-ce64e4f543bf-kube-api-access-28qhc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:06 crc kubenswrapper[4921]: I0318 12:27:06.854992 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:07 crc kubenswrapper[4921]: I0318 12:27:07.093515 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c"] Mar 18 12:27:07 crc kubenswrapper[4921]: I0318 12:27:07.136130 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" event={"ID":"139d771c-f615-472e-9ac9-ce64e4f543bf","Type":"ContainerStarted","Data":"1c0eb7129359b27225761051026f56f29df6261e0a4bf99453eca26c745ebf6d"} Mar 18 12:27:08 crc kubenswrapper[4921]: I0318 12:27:08.142488 4921 generic.go:334] "Generic (PLEG): container finished" podID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerID="e9aa48c5381d7f9dbcd029093187c93d470f6471b1b453f398e26fcc39f17334" exitCode=0 Mar 18 12:27:08 crc kubenswrapper[4921]: I0318 12:27:08.142557 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" event={"ID":"139d771c-f615-472e-9ac9-ce64e4f543bf","Type":"ContainerDied","Data":"e9aa48c5381d7f9dbcd029093187c93d470f6471b1b453f398e26fcc39f17334"} Mar 18 12:27:08 crc kubenswrapper[4921]: I0318 12:27:08.658911 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:27:08 crc kubenswrapper[4921]: I0318 12:27:08.724172 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:27:12 crc kubenswrapper[4921]: I0318 12:27:12.171188 4921 generic.go:334] "Generic (PLEG): container finished" podID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerID="c916572fa4513b6a135053c25202216d72b353d687ff02bcab2a3e8ee825f06e" exitCode=0 Mar 18 12:27:12 crc kubenswrapper[4921]: I0318 12:27:12.171302 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" event={"ID":"139d771c-f615-472e-9ac9-ce64e4f543bf","Type":"ContainerDied","Data":"c916572fa4513b6a135053c25202216d72b353d687ff02bcab2a3e8ee825f06e"} Mar 18 12:27:13 crc kubenswrapper[4921]: I0318 12:27:13.178918 4921 generic.go:334] "Generic (PLEG): container finished" podID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerID="45cfe53068fc25a20f99998e17ac82b8701ed233f5157cda2594d9adef26759f" exitCode=0 Mar 18 12:27:13 crc kubenswrapper[4921]: I0318 12:27:13.178957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" event={"ID":"139d771c-f615-472e-9ac9-ce64e4f543bf","Type":"ContainerDied","Data":"45cfe53068fc25a20f99998e17ac82b8701ed233f5157cda2594d9adef26759f"} Mar 18 12:27:13 crc kubenswrapper[4921]: I0318 12:27:13.676593 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-nbvzr" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.529205 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.561937 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-util\") pod \"139d771c-f615-472e-9ac9-ce64e4f543bf\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.562057 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-bundle\") pod \"139d771c-f615-472e-9ac9-ce64e4f543bf\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.562124 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28qhc\" (UniqueName: \"kubernetes.io/projected/139d771c-f615-472e-9ac9-ce64e4f543bf-kube-api-access-28qhc\") pod \"139d771c-f615-472e-9ac9-ce64e4f543bf\" (UID: \"139d771c-f615-472e-9ac9-ce64e4f543bf\") " Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.564219 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-bundle" (OuterVolumeSpecName: "bundle") pod "139d771c-f615-472e-9ac9-ce64e4f543bf" (UID: "139d771c-f615-472e-9ac9-ce64e4f543bf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.570295 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139d771c-f615-472e-9ac9-ce64e4f543bf-kube-api-access-28qhc" (OuterVolumeSpecName: "kube-api-access-28qhc") pod "139d771c-f615-472e-9ac9-ce64e4f543bf" (UID: "139d771c-f615-472e-9ac9-ce64e4f543bf"). InnerVolumeSpecName "kube-api-access-28qhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.572535 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-util" (OuterVolumeSpecName: "util") pod "139d771c-f615-472e-9ac9-ce64e4f543bf" (UID: "139d771c-f615-472e-9ac9-ce64e4f543bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.663511 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.663609 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28qhc\" (UniqueName: \"kubernetes.io/projected/139d771c-f615-472e-9ac9-ce64e4f543bf-kube-api-access-28qhc\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:14 crc kubenswrapper[4921]: I0318 12:27:14.663628 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139d771c-f615-472e-9ac9-ce64e4f543bf-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:15 crc kubenswrapper[4921]: I0318 12:27:15.197163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" 
event={"ID":"139d771c-f615-472e-9ac9-ce64e4f543bf","Type":"ContainerDied","Data":"1c0eb7129359b27225761051026f56f29df6261e0a4bf99453eca26c745ebf6d"} Mar 18 12:27:15 crc kubenswrapper[4921]: I0318 12:27:15.197490 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0eb7129359b27225761051026f56f29df6261e0a4bf99453eca26c745ebf6d" Mar 18 12:27:15 crc kubenswrapper[4921]: I0318 12:27:15.197284 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.852963 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp"] Mar 18 12:27:19 crc kubenswrapper[4921]: E0318 12:27:19.853510 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="util" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.853525 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="util" Mar 18 12:27:19 crc kubenswrapper[4921]: E0318 12:27:19.853539 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="extract" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.853544 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="extract" Mar 18 12:27:19 crc kubenswrapper[4921]: E0318 12:27:19.853554 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="pull" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.853560 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="pull" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.853671 
4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="139d771c-f615-472e-9ac9-ce64e4f543bf" containerName="extract" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.854097 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.855925 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.856305 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.860338 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-fwls8" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.875448 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp"] Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.935834 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55e367fe-d816-4546-9d62-8624c48ec9c4-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-dwpjp\" (UID: \"55e367fe-d816-4546-9d62-8624c48ec9c4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:19 crc kubenswrapper[4921]: I0318 12:27:19.935914 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87pl5\" (UniqueName: \"kubernetes.io/projected/55e367fe-d816-4546-9d62-8624c48ec9c4-kube-api-access-87pl5\") pod \"cert-manager-operator-controller-manager-66c8bdd694-dwpjp\" (UID: \"55e367fe-d816-4546-9d62-8624c48ec9c4\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:20 crc kubenswrapper[4921]: I0318 12:27:20.037037 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55e367fe-d816-4546-9d62-8624c48ec9c4-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-dwpjp\" (UID: \"55e367fe-d816-4546-9d62-8624c48ec9c4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:20 crc kubenswrapper[4921]: I0318 12:27:20.037126 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87pl5\" (UniqueName: \"kubernetes.io/projected/55e367fe-d816-4546-9d62-8624c48ec9c4-kube-api-access-87pl5\") pod \"cert-manager-operator-controller-manager-66c8bdd694-dwpjp\" (UID: \"55e367fe-d816-4546-9d62-8624c48ec9c4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:20 crc kubenswrapper[4921]: I0318 12:27:20.037776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55e367fe-d816-4546-9d62-8624c48ec9c4-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-dwpjp\" (UID: \"55e367fe-d816-4546-9d62-8624c48ec9c4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:20 crc kubenswrapper[4921]: I0318 12:27:20.107211 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87pl5\" (UniqueName: \"kubernetes.io/projected/55e367fe-d816-4546-9d62-8624c48ec9c4-kube-api-access-87pl5\") pod \"cert-manager-operator-controller-manager-66c8bdd694-dwpjp\" (UID: \"55e367fe-d816-4546-9d62-8624c48ec9c4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:20 crc kubenswrapper[4921]: I0318 12:27:20.170189 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" Mar 18 12:27:20 crc kubenswrapper[4921]: I0318 12:27:20.653510 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp"] Mar 18 12:27:21 crc kubenswrapper[4921]: I0318 12:27:21.232778 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" event={"ID":"55e367fe-d816-4546-9d62-8624c48ec9c4","Type":"ContainerStarted","Data":"20d0bb6e2524655abaae10e03401366cfe31bed9366453e8cbbcacdffb0b9041"} Mar 18 12:27:23 crc kubenswrapper[4921]: I0318 12:27:23.660497 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-b2vvr" Mar 18 12:27:24 crc kubenswrapper[4921]: I0318 12:27:24.254953 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" event={"ID":"55e367fe-d816-4546-9d62-8624c48ec9c4","Type":"ContainerStarted","Data":"3e12af8d7501beaaa8f6f1af8a82b63e7c0498fc50acb93cd45af0431f3ff41a"} Mar 18 12:27:24 crc kubenswrapper[4921]: I0318 12:27:24.272068 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-dwpjp" podStartSLOduration=2.349634277 podStartE2EDuration="5.27205018s" podCreationTimestamp="2026-03-18 12:27:19 +0000 UTC" firstStartedPulling="2026-03-18 12:27:20.659656837 +0000 UTC m=+1060.209577486" lastFinishedPulling="2026-03-18 12:27:23.58207275 +0000 UTC m=+1063.131993389" observedRunningTime="2026-03-18 12:27:24.269710814 +0000 UTC m=+1063.819631453" watchObservedRunningTime="2026-03-18 12:27:24.27205018 +0000 UTC m=+1063.821970819" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.519572 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-qmjjj"] Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.521284 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.523931 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8npck" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.524273 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.524362 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.533633 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rld4t\" (UniqueName: \"kubernetes.io/projected/aca422c3-e9e3-435e-8b17-8da14882eaae-kube-api-access-rld4t\") pod \"cert-manager-webhook-6888856db4-qmjjj\" (UID: \"aca422c3-e9e3-435e-8b17-8da14882eaae\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.533697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca422c3-e9e3-435e-8b17-8da14882eaae-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmjjj\" (UID: \"aca422c3-e9e3-435e-8b17-8da14882eaae\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.538759 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmjjj"] Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.635219 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/aca422c3-e9e3-435e-8b17-8da14882eaae-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmjjj\" (UID: \"aca422c3-e9e3-435e-8b17-8da14882eaae\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.635348 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rld4t\" (UniqueName: \"kubernetes.io/projected/aca422c3-e9e3-435e-8b17-8da14882eaae-kube-api-access-rld4t\") pod \"cert-manager-webhook-6888856db4-qmjjj\" (UID: \"aca422c3-e9e3-435e-8b17-8da14882eaae\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.654855 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aca422c3-e9e3-435e-8b17-8da14882eaae-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qmjjj\" (UID: \"aca422c3-e9e3-435e-8b17-8da14882eaae\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.669365 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rld4t\" (UniqueName: \"kubernetes.io/projected/aca422c3-e9e3-435e-8b17-8da14882eaae-kube-api-access-rld4t\") pod \"cert-manager-webhook-6888856db4-qmjjj\" (UID: \"aca422c3-e9e3-435e-8b17-8da14882eaae\") " pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:26 crc kubenswrapper[4921]: I0318 12:27:26.835655 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:27 crc kubenswrapper[4921]: I0318 12:27:27.088544 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qmjjj"] Mar 18 12:27:27 crc kubenswrapper[4921]: I0318 12:27:27.282231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" event={"ID":"aca422c3-e9e3-435e-8b17-8da14882eaae","Type":"ContainerStarted","Data":"a6ca3138090f8fe069c000ab233b9959e50093d5bb8e12660f89a5c335fdafd0"} Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.649614 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bzfzt"] Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.655368 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.657706 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2lg5r" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.678429 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bzfzt"] Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.802360 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a64439b0-8ac8-43aa-addd-e6cdb72bf3f4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bzfzt\" (UID: \"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.802499 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkz8t\" (UniqueName: 
\"kubernetes.io/projected/a64439b0-8ac8-43aa-addd-e6cdb72bf3f4-kube-api-access-rkz8t\") pod \"cert-manager-cainjector-5545bd876-bzfzt\" (UID: \"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.904697 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a64439b0-8ac8-43aa-addd-e6cdb72bf3f4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bzfzt\" (UID: \"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.904830 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkz8t\" (UniqueName: \"kubernetes.io/projected/a64439b0-8ac8-43aa-addd-e6cdb72bf3f4-kube-api-access-rkz8t\") pod \"cert-manager-cainjector-5545bd876-bzfzt\" (UID: \"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.928460 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkz8t\" (UniqueName: \"kubernetes.io/projected/a64439b0-8ac8-43aa-addd-e6cdb72bf3f4-kube-api-access-rkz8t\") pod \"cert-manager-cainjector-5545bd876-bzfzt\" (UID: \"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.933441 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a64439b0-8ac8-43aa-addd-e6cdb72bf3f4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bzfzt\" (UID: \"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:29 crc kubenswrapper[4921]: I0318 12:27:29.981166 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" Mar 18 12:27:30 crc kubenswrapper[4921]: I0318 12:27:30.405354 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bzfzt"] Mar 18 12:27:31 crc kubenswrapper[4921]: W0318 12:27:31.870654 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64439b0_8ac8_43aa_addd_e6cdb72bf3f4.slice/crio-3ac7d3f58707869be983283251c72a59d6d86e225e9c0c4814efe06e6781ff5e WatchSource:0}: Error finding container 3ac7d3f58707869be983283251c72a59d6d86e225e9c0c4814efe06e6781ff5e: Status 404 returned error can't find the container with id 3ac7d3f58707869be983283251c72a59d6d86e225e9c0c4814efe06e6781ff5e Mar 18 12:27:32 crc kubenswrapper[4921]: I0318 12:27:32.322397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" event={"ID":"aca422c3-e9e3-435e-8b17-8da14882eaae","Type":"ContainerStarted","Data":"d1c8e4b30846228d24bffa523b35f68dbd5bedf07bdcc6070fb6ac7e9e8ff7c9"} Mar 18 12:27:32 crc kubenswrapper[4921]: I0318 12:27:32.322791 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:32 crc kubenswrapper[4921]: I0318 12:27:32.323790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" event={"ID":"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4","Type":"ContainerStarted","Data":"3ac7d3f58707869be983283251c72a59d6d86e225e9c0c4814efe06e6781ff5e"} Mar 18 12:27:32 crc kubenswrapper[4921]: I0318 12:27:32.337749 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" podStartSLOduration=1.496788346 podStartE2EDuration="6.337734002s" podCreationTimestamp="2026-03-18 12:27:26 +0000 UTC" 
firstStartedPulling="2026-03-18 12:27:27.10042368 +0000 UTC m=+1066.650344319" lastFinishedPulling="2026-03-18 12:27:31.941369336 +0000 UTC m=+1071.491289975" observedRunningTime="2026-03-18 12:27:32.337623598 +0000 UTC m=+1071.887544247" watchObservedRunningTime="2026-03-18 12:27:32.337734002 +0000 UTC m=+1071.887654641" Mar 18 12:27:33 crc kubenswrapper[4921]: I0318 12:27:33.332418 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" event={"ID":"a64439b0-8ac8-43aa-addd-e6cdb72bf3f4","Type":"ContainerStarted","Data":"0f9d4203a37edd1cc82b58336df539d9d1d9e93a5b9c05af9219e7361d44f92f"} Mar 18 12:27:33 crc kubenswrapper[4921]: I0318 12:27:33.347048 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-bzfzt" podStartSLOduration=3.210186448 podStartE2EDuration="4.347030264s" podCreationTimestamp="2026-03-18 12:27:29 +0000 UTC" firstStartedPulling="2026-03-18 12:27:31.873325079 +0000 UTC m=+1071.423245738" lastFinishedPulling="2026-03-18 12:27:33.010168915 +0000 UTC m=+1072.560089554" observedRunningTime="2026-03-18 12:27:33.344745109 +0000 UTC m=+1072.894665758" watchObservedRunningTime="2026-03-18 12:27:33.347030264 +0000 UTC m=+1072.896950903" Mar 18 12:27:41 crc kubenswrapper[4921]: I0318 12:27:41.839931 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-qmjjj" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.436749 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-gb4dc"] Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.438249 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.440959 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8m6pl" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.457504 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-gb4dc"] Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.526440 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjpbj\" (UniqueName: \"kubernetes.io/projected/0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9-kube-api-access-zjpbj\") pod \"cert-manager-545d4d4674-gb4dc\" (UID: \"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9\") " pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.526501 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9-bound-sa-token\") pod \"cert-manager-545d4d4674-gb4dc\" (UID: \"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9\") " pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.627733 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjpbj\" (UniqueName: \"kubernetes.io/projected/0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9-kube-api-access-zjpbj\") pod \"cert-manager-545d4d4674-gb4dc\" (UID: \"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9\") " pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.627808 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9-bound-sa-token\") pod \"cert-manager-545d4d4674-gb4dc\" (UID: 
\"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9\") " pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.646849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9-bound-sa-token\") pod \"cert-manager-545d4d4674-gb4dc\" (UID: \"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9\") " pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.649329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjpbj\" (UniqueName: \"kubernetes.io/projected/0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9-kube-api-access-zjpbj\") pod \"cert-manager-545d4d4674-gb4dc\" (UID: \"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9\") " pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.760375 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-gb4dc" Mar 18 12:27:45 crc kubenswrapper[4921]: I0318 12:27:45.998395 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-gb4dc"] Mar 18 12:27:46 crc kubenswrapper[4921]: I0318 12:27:46.418957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-gb4dc" event={"ID":"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9","Type":"ContainerStarted","Data":"9bbfbbf2b75bc4870ac97e08c86c3859d17add661cea18e7e52755c23be73918"} Mar 18 12:27:46 crc kubenswrapper[4921]: I0318 12:27:46.419339 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-gb4dc" event={"ID":"0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9","Type":"ContainerStarted","Data":"2deda38542523763659323588a84a6a69d1a33f95a393976e733c642f381df74"} Mar 18 12:27:46 crc kubenswrapper[4921]: I0318 12:27:46.437283 4921 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-gb4dc" podStartSLOduration=1.43726284 podStartE2EDuration="1.43726284s" podCreationTimestamp="2026-03-18 12:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:27:46.433883875 +0000 UTC m=+1085.983804514" watchObservedRunningTime="2026-03-18 12:27:46.43726284 +0000 UTC m=+1085.987183479" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.105003 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zrdx7"] Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.106120 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.108049 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.108096 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-7dxzr" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.108245 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.177397 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zrdx7"] Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.259940 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c88ht\" (UniqueName: \"kubernetes.io/projected/7f0d3174-f5a3-4712-908e-00a4aa1a0169-kube-api-access-c88ht\") pod \"openstack-operator-index-zrdx7\" (UID: \"7f0d3174-f5a3-4712-908e-00a4aa1a0169\") " pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:55 crc 
kubenswrapper[4921]: I0318 12:27:55.362185 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c88ht\" (UniqueName: \"kubernetes.io/projected/7f0d3174-f5a3-4712-908e-00a4aa1a0169-kube-api-access-c88ht\") pod \"openstack-operator-index-zrdx7\" (UID: \"7f0d3174-f5a3-4712-908e-00a4aa1a0169\") " pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.382221 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c88ht\" (UniqueName: \"kubernetes.io/projected/7f0d3174-f5a3-4712-908e-00a4aa1a0169-kube-api-access-c88ht\") pod \"openstack-operator-index-zrdx7\" (UID: \"7f0d3174-f5a3-4712-908e-00a4aa1a0169\") " pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.424854 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:55 crc kubenswrapper[4921]: I0318 12:27:55.830610 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zrdx7"] Mar 18 12:27:56 crc kubenswrapper[4921]: I0318 12:27:56.492442 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zrdx7" event={"ID":"7f0d3174-f5a3-4712-908e-00a4aa1a0169","Type":"ContainerStarted","Data":"a04d61777a23a019aa47fec2b9021177449e3b4aca09d5ed04b3824f0693c2ae"} Mar 18 12:27:57 crc kubenswrapper[4921]: I0318 12:27:57.275944 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zrdx7"] Mar 18 12:27:57 crc kubenswrapper[4921]: I0318 12:27:57.887145 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-8cvjq"] Mar 18 12:27:57 crc kubenswrapper[4921]: I0318 12:27:57.887976 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:27:57 crc kubenswrapper[4921]: I0318 12:27:57.892292 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8cvjq"] Mar 18 12:27:57 crc kubenswrapper[4921]: I0318 12:27:57.902186 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxx8k\" (UniqueName: \"kubernetes.io/projected/9a5083f2-1437-4efc-b0ed-3e18a4bc8a81-kube-api-access-nxx8k\") pod \"openstack-operator-index-8cvjq\" (UID: \"9a5083f2-1437-4efc-b0ed-3e18a4bc8a81\") " pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.003603 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxx8k\" (UniqueName: \"kubernetes.io/projected/9a5083f2-1437-4efc-b0ed-3e18a4bc8a81-kube-api-access-nxx8k\") pod \"openstack-operator-index-8cvjq\" (UID: \"9a5083f2-1437-4efc-b0ed-3e18a4bc8a81\") " pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.031734 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxx8k\" (UniqueName: \"kubernetes.io/projected/9a5083f2-1437-4efc-b0ed-3e18a4bc8a81-kube-api-access-nxx8k\") pod \"openstack-operator-index-8cvjq\" (UID: \"9a5083f2-1437-4efc-b0ed-3e18a4bc8a81\") " pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.208307 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.509189 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zrdx7" event={"ID":"7f0d3174-f5a3-4712-908e-00a4aa1a0169","Type":"ContainerStarted","Data":"5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0"} Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.509372 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-zrdx7" podUID="7f0d3174-f5a3-4712-908e-00a4aa1a0169" containerName="registry-server" containerID="cri-o://5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0" gracePeriod=2 Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.537446 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zrdx7" podStartSLOduration=1.432493235 podStartE2EDuration="3.537418877s" podCreationTimestamp="2026-03-18 12:27:55 +0000 UTC" firstStartedPulling="2026-03-18 12:27:55.835709795 +0000 UTC m=+1095.385630444" lastFinishedPulling="2026-03-18 12:27:57.940635437 +0000 UTC m=+1097.490556086" observedRunningTime="2026-03-18 12:27:58.52976015 +0000 UTC m=+1098.079680829" watchObservedRunningTime="2026-03-18 12:27:58.537418877 +0000 UTC m=+1098.087339546" Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.607746 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8cvjq"] Mar 18 12:27:58 crc kubenswrapper[4921]: I0318 12:27:58.850397 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.017524 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c88ht\" (UniqueName: \"kubernetes.io/projected/7f0d3174-f5a3-4712-908e-00a4aa1a0169-kube-api-access-c88ht\") pod \"7f0d3174-f5a3-4712-908e-00a4aa1a0169\" (UID: \"7f0d3174-f5a3-4712-908e-00a4aa1a0169\") " Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.023008 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0d3174-f5a3-4712-908e-00a4aa1a0169-kube-api-access-c88ht" (OuterVolumeSpecName: "kube-api-access-c88ht") pod "7f0d3174-f5a3-4712-908e-00a4aa1a0169" (UID: "7f0d3174-f5a3-4712-908e-00a4aa1a0169"). InnerVolumeSpecName "kube-api-access-c88ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.118806 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c88ht\" (UniqueName: \"kubernetes.io/projected/7f0d3174-f5a3-4712-908e-00a4aa1a0169-kube-api-access-c88ht\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.516813 4921 generic.go:334] "Generic (PLEG): container finished" podID="7f0d3174-f5a3-4712-908e-00a4aa1a0169" containerID="5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0" exitCode=0 Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.516920 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zrdx7" event={"ID":"7f0d3174-f5a3-4712-908e-00a4aa1a0169","Type":"ContainerDied","Data":"5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0"} Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.516962 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zrdx7" 
event={"ID":"7f0d3174-f5a3-4712-908e-00a4aa1a0169","Type":"ContainerDied","Data":"a04d61777a23a019aa47fec2b9021177449e3b4aca09d5ed04b3824f0693c2ae"} Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.516990 4921 scope.go:117] "RemoveContainer" containerID="5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.517522 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zrdx7" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.519029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8cvjq" event={"ID":"9a5083f2-1437-4efc-b0ed-3e18a4bc8a81","Type":"ContainerStarted","Data":"ecfcdff3e4d096e405c7b5afdb882b4244018d5631e59e42a0529d93e5b8b6c7"} Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.519068 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8cvjq" event={"ID":"9a5083f2-1437-4efc-b0ed-3e18a4bc8a81","Type":"ContainerStarted","Data":"0eb9ee2d1d91a189b07a6eca560c157b6f5316e97f0a5d828080c42c3fd16537"} Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.540897 4921 scope.go:117] "RemoveContainer" containerID="5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0" Mar 18 12:27:59 crc kubenswrapper[4921]: E0318 12:27:59.541512 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0\": container with ID starting with 5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0 not found: ID does not exist" containerID="5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.541562 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0"} err="failed to get container status \"5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0\": rpc error: code = NotFound desc = could not find container \"5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0\": container with ID starting with 5316eba83a358daaab6dfbac45cc614c142f05e121fafc363780619d3cc23fc0 not found: ID does not exist" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.549895 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-8cvjq" podStartSLOduration=2.491699783 podStartE2EDuration="2.549878631s" podCreationTimestamp="2026-03-18 12:27:57 +0000 UTC" firstStartedPulling="2026-03-18 12:27:58.62119896 +0000 UTC m=+1098.171119609" lastFinishedPulling="2026-03-18 12:27:58.679377818 +0000 UTC m=+1098.229298457" observedRunningTime="2026-03-18 12:27:59.538404576 +0000 UTC m=+1099.088325215" watchObservedRunningTime="2026-03-18 12:27:59.549878631 +0000 UTC m=+1099.099799270" Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.556811 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zrdx7"] Mar 18 12:27:59 crc kubenswrapper[4921]: I0318 12:27:59.560754 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-zrdx7"] Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.135286 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563948-ffjpc"] Mar 18 12:28:00 crc kubenswrapper[4921]: E0318 12:28:00.135680 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0d3174-f5a3-4712-908e-00a4aa1a0169" containerName="registry-server" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.135702 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0d3174-f5a3-4712-908e-00a4aa1a0169" 
containerName="registry-server" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.135917 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0d3174-f5a3-4712-908e-00a4aa1a0169" containerName="registry-server" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.136687 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.139602 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.139888 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.140310 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.154639 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-ffjpc"] Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.335693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86zfr\" (UniqueName: \"kubernetes.io/projected/e08e8f99-4229-4ba3-979f-0a1cc8e3406f-kube-api-access-86zfr\") pod \"auto-csr-approver-29563948-ffjpc\" (UID: \"e08e8f99-4229-4ba3-979f-0a1cc8e3406f\") " pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.437350 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86zfr\" (UniqueName: \"kubernetes.io/projected/e08e8f99-4229-4ba3-979f-0a1cc8e3406f-kube-api-access-86zfr\") pod \"auto-csr-approver-29563948-ffjpc\" (UID: \"e08e8f99-4229-4ba3-979f-0a1cc8e3406f\") " pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 
12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.475817 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86zfr\" (UniqueName: \"kubernetes.io/projected/e08e8f99-4229-4ba3-979f-0a1cc8e3406f-kube-api-access-86zfr\") pod \"auto-csr-approver-29563948-ffjpc\" (UID: \"e08e8f99-4229-4ba3-979f-0a1cc8e3406f\") " pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.750870 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 12:28:00 crc kubenswrapper[4921]: I0318 12:28:00.940154 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-ffjpc"] Mar 18 12:28:01 crc kubenswrapper[4921]: I0318 12:28:01.217894 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0d3174-f5a3-4712-908e-00a4aa1a0169" path="/var/lib/kubelet/pods/7f0d3174-f5a3-4712-908e-00a4aa1a0169/volumes" Mar 18 12:28:01 crc kubenswrapper[4921]: I0318 12:28:01.556596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" event={"ID":"e08e8f99-4229-4ba3-979f-0a1cc8e3406f","Type":"ContainerStarted","Data":"28a4b9125bc4bf86b034b4b714c28285c3b5eb62fba6baa436277ce8afb2079a"} Mar 18 12:28:02 crc kubenswrapper[4921]: I0318 12:28:02.565663 4921 generic.go:334] "Generic (PLEG): container finished" podID="e08e8f99-4229-4ba3-979f-0a1cc8e3406f" containerID="c1aa42dc58755d43fc6bf73449b283c67dd1eb8ba8fa8c52819510041fa53028" exitCode=0 Mar 18 12:28:02 crc kubenswrapper[4921]: I0318 12:28:02.565730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" event={"ID":"e08e8f99-4229-4ba3-979f-0a1cc8e3406f","Type":"ContainerDied","Data":"c1aa42dc58755d43fc6bf73449b283c67dd1eb8ba8fa8c52819510041fa53028"} Mar 18 12:28:03 crc kubenswrapper[4921]: I0318 12:28:03.883854 4921 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 12:28:03 crc kubenswrapper[4921]: I0318 12:28:03.998609 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86zfr\" (UniqueName: \"kubernetes.io/projected/e08e8f99-4229-4ba3-979f-0a1cc8e3406f-kube-api-access-86zfr\") pod \"e08e8f99-4229-4ba3-979f-0a1cc8e3406f\" (UID: \"e08e8f99-4229-4ba3-979f-0a1cc8e3406f\") " Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.006375 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08e8f99-4229-4ba3-979f-0a1cc8e3406f-kube-api-access-86zfr" (OuterVolumeSpecName: "kube-api-access-86zfr") pod "e08e8f99-4229-4ba3-979f-0a1cc8e3406f" (UID: "e08e8f99-4229-4ba3-979f-0a1cc8e3406f"). InnerVolumeSpecName "kube-api-access-86zfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.100425 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86zfr\" (UniqueName: \"kubernetes.io/projected/e08e8f99-4229-4ba3-979f-0a1cc8e3406f-kube-api-access-86zfr\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.580484 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" event={"ID":"e08e8f99-4229-4ba3-979f-0a1cc8e3406f","Type":"ContainerDied","Data":"28a4b9125bc4bf86b034b4b714c28285c3b5eb62fba6baa436277ce8afb2079a"} Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.580572 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-ffjpc" Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.580571 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28a4b9125bc4bf86b034b4b714c28285c3b5eb62fba6baa436277ce8afb2079a" Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.943722 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-84ssn"] Mar 18 12:28:04 crc kubenswrapper[4921]: I0318 12:28:04.951668 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-84ssn"] Mar 18 12:28:05 crc kubenswrapper[4921]: I0318 12:28:05.217723 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c168d5bc-4e2c-4365-9951-b9bd84c375d9" path="/var/lib/kubelet/pods/c168d5bc-4e2c-4365-9951-b9bd84c375d9/volumes" Mar 18 12:28:08 crc kubenswrapper[4921]: I0318 12:28:08.208686 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:28:08 crc kubenswrapper[4921]: I0318 12:28:08.209199 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:28:08 crc kubenswrapper[4921]: I0318 12:28:08.247102 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:28:08 crc kubenswrapper[4921]: I0318 12:28:08.641244 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-8cvjq" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.023304 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh"] Mar 18 12:28:15 crc kubenswrapper[4921]: E0318 12:28:15.024241 4921 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e08e8f99-4229-4ba3-979f-0a1cc8e3406f" containerName="oc" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.024270 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08e8f99-4229-4ba3-979f-0a1cc8e3406f" containerName="oc" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.025313 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08e8f99-4229-4ba3-979f-0a1cc8e3406f" containerName="oc" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.027646 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.031225 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-s8dkr" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.033606 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh"] Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.152527 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j447z\" (UniqueName: \"kubernetes.io/projected/c30738d0-ecb7-42ef-8428-f2f06233f338-kube-api-access-j447z\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.152664 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " 
pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.152721 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.254685 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.254768 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.254903 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j447z\" (UniqueName: \"kubernetes.io/projected/c30738d0-ecb7-42ef-8428-f2f06233f338-kube-api-access-j447z\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 
12:28:15.255421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.255733 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.277349 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j447z\" (UniqueName: \"kubernetes.io/projected/c30738d0-ecb7-42ef-8428-f2f06233f338-kube-api-access-j447z\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.347207 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:15 crc kubenswrapper[4921]: I0318 12:28:15.765985 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh"] Mar 18 12:28:15 crc kubenswrapper[4921]: W0318 12:28:15.776209 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30738d0_ecb7_42ef_8428_f2f06233f338.slice/crio-2f076fadc7e894b980689f459040bf175680694ac115394edaf0af6bcf81b26a WatchSource:0}: Error finding container 2f076fadc7e894b980689f459040bf175680694ac115394edaf0af6bcf81b26a: Status 404 returned error can't find the container with id 2f076fadc7e894b980689f459040bf175680694ac115394edaf0af6bcf81b26a Mar 18 12:28:16 crc kubenswrapper[4921]: I0318 12:28:16.667783 4921 generic.go:334] "Generic (PLEG): container finished" podID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerID="555f502cb4f89cd22fe869623b8f0c8fd1c5fb8dae008312d44d0acc3f6ac894" exitCode=0 Mar 18 12:28:16 crc kubenswrapper[4921]: I0318 12:28:16.667846 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" event={"ID":"c30738d0-ecb7-42ef-8428-f2f06233f338","Type":"ContainerDied","Data":"555f502cb4f89cd22fe869623b8f0c8fd1c5fb8dae008312d44d0acc3f6ac894"} Mar 18 12:28:16 crc kubenswrapper[4921]: I0318 12:28:16.667911 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" event={"ID":"c30738d0-ecb7-42ef-8428-f2f06233f338","Type":"ContainerStarted","Data":"2f076fadc7e894b980689f459040bf175680694ac115394edaf0af6bcf81b26a"} Mar 18 12:28:17 crc kubenswrapper[4921]: I0318 12:28:17.677307 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerID="b8508243fcf59af103390937d03123364f75de2b763c4eda5e63ddba68daf09b" exitCode=0 Mar 18 12:28:17 crc kubenswrapper[4921]: I0318 12:28:17.677353 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" event={"ID":"c30738d0-ecb7-42ef-8428-f2f06233f338","Type":"ContainerDied","Data":"b8508243fcf59af103390937d03123364f75de2b763c4eda5e63ddba68daf09b"} Mar 18 12:28:18 crc kubenswrapper[4921]: I0318 12:28:18.684831 4921 generic.go:334] "Generic (PLEG): container finished" podID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerID="0280a3cac0055525c04c6756c4fb3d0afb02947e5331bfe568f0a738ffed988d" exitCode=0 Mar 18 12:28:18 crc kubenswrapper[4921]: I0318 12:28:18.685096 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" event={"ID":"c30738d0-ecb7-42ef-8428-f2f06233f338","Type":"ContainerDied","Data":"0280a3cac0055525c04c6756c4fb3d0afb02947e5331bfe568f0a738ffed988d"} Mar 18 12:28:19 crc kubenswrapper[4921]: I0318 12:28:19.976146 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.124669 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-bundle\") pod \"c30738d0-ecb7-42ef-8428-f2f06233f338\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.124753 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-util\") pod \"c30738d0-ecb7-42ef-8428-f2f06233f338\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.124814 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j447z\" (UniqueName: \"kubernetes.io/projected/c30738d0-ecb7-42ef-8428-f2f06233f338-kube-api-access-j447z\") pod \"c30738d0-ecb7-42ef-8428-f2f06233f338\" (UID: \"c30738d0-ecb7-42ef-8428-f2f06233f338\") " Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.125446 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-bundle" (OuterVolumeSpecName: "bundle") pod "c30738d0-ecb7-42ef-8428-f2f06233f338" (UID: "c30738d0-ecb7-42ef-8428-f2f06233f338"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.131048 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30738d0-ecb7-42ef-8428-f2f06233f338-kube-api-access-j447z" (OuterVolumeSpecName: "kube-api-access-j447z") pod "c30738d0-ecb7-42ef-8428-f2f06233f338" (UID: "c30738d0-ecb7-42ef-8428-f2f06233f338"). InnerVolumeSpecName "kube-api-access-j447z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.140708 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-util" (OuterVolumeSpecName: "util") pod "c30738d0-ecb7-42ef-8428-f2f06233f338" (UID: "c30738d0-ecb7-42ef-8428-f2f06233f338"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.227220 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.227272 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j447z\" (UniqueName: \"kubernetes.io/projected/c30738d0-ecb7-42ef-8428-f2f06233f338-kube-api-access-j447z\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.227306 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c30738d0-ecb7-42ef-8428-f2f06233f338-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.701790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" event={"ID":"c30738d0-ecb7-42ef-8428-f2f06233f338","Type":"ContainerDied","Data":"2f076fadc7e894b980689f459040bf175680694ac115394edaf0af6bcf81b26a"} Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.702145 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f076fadc7e894b980689f459040bf175680694ac115394edaf0af6bcf81b26a" Mar 18 12:28:20 crc kubenswrapper[4921]: I0318 12:28:20.701860 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.028863 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w"] Mar 18 12:28:27 crc kubenswrapper[4921]: E0318 12:28:27.029539 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="pull" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.029558 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="pull" Mar 18 12:28:27 crc kubenswrapper[4921]: E0318 12:28:27.029576 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="util" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.029583 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="util" Mar 18 12:28:27 crc kubenswrapper[4921]: E0318 12:28:27.029598 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="extract" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.029605 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="extract" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.029721 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30738d0-ecb7-42ef-8428-f2f06233f338" containerName="extract" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.030225 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.033132 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-qszl2" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.086444 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w"] Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.120604 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs2wd\" (UniqueName: \"kubernetes.io/projected/bd46d074-f312-48a3-ae07-5889a432d9bd-kube-api-access-zs2wd\") pod \"openstack-operator-controller-init-68ccf9867-95s7w\" (UID: \"bd46d074-f312-48a3-ae07-5889a432d9bd\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.222144 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs2wd\" (UniqueName: \"kubernetes.io/projected/bd46d074-f312-48a3-ae07-5889a432d9bd-kube-api-access-zs2wd\") pod \"openstack-operator-controller-init-68ccf9867-95s7w\" (UID: \"bd46d074-f312-48a3-ae07-5889a432d9bd\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.251301 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs2wd\" (UniqueName: \"kubernetes.io/projected/bd46d074-f312-48a3-ae07-5889a432d9bd-kube-api-access-zs2wd\") pod \"openstack-operator-controller-init-68ccf9867-95s7w\" (UID: \"bd46d074-f312-48a3-ae07-5889a432d9bd\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.348519 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.563062 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w"] Mar 18 12:28:27 crc kubenswrapper[4921]: I0318 12:28:27.754882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" event={"ID":"bd46d074-f312-48a3-ae07-5889a432d9bd","Type":"ContainerStarted","Data":"b479122f7a8a35b83c10e291d5fbfcb0876c85ce30ae81f3c4cbf7df6e0894a2"} Mar 18 12:28:32 crc kubenswrapper[4921]: I0318 12:28:32.786425 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" event={"ID":"bd46d074-f312-48a3-ae07-5889a432d9bd","Type":"ContainerStarted","Data":"de17c989efd6c46e9ba89d6cb8b37f50aed4fe61d40e6f5bec8dc9f8eec6ce31"} Mar 18 12:28:32 crc kubenswrapper[4921]: I0318 12:28:32.787071 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:32 crc kubenswrapper[4921]: I0318 12:28:32.822439 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" podStartSLOduration=2.57625607 podStartE2EDuration="6.822423496s" podCreationTimestamp="2026-03-18 12:28:26 +0000 UTC" firstStartedPulling="2026-03-18 12:28:27.572616564 +0000 UTC m=+1127.122537203" lastFinishedPulling="2026-03-18 12:28:31.81878399 +0000 UTC m=+1131.368704629" observedRunningTime="2026-03-18 12:28:32.818566307 +0000 UTC m=+1132.368486956" watchObservedRunningTime="2026-03-18 12:28:32.822423496 +0000 UTC m=+1132.372344135" Mar 18 12:28:37 crc kubenswrapper[4921]: I0318 12:28:37.351331 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-68ccf9867-95s7w" Mar 18 12:28:47 crc kubenswrapper[4921]: I0318 12:28:47.081701 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:28:47 crc kubenswrapper[4921]: I0318 12:28:47.082271 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:28:48 crc kubenswrapper[4921]: I0318 12:28:48.414324 4921 scope.go:117] "RemoveContainer" containerID="7ced53491cfad4ac4d58b1a2ad9addb90fd487837b693e043699afdb7f0c658c" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.236732 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.238729 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.241081 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xz696" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.247942 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.258297 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.259051 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.270761 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-p2scx" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.278256 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.279237 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.281520 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-h55jh" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.288494 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.289425 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.292379 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4nsz9" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.302446 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.312145 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.332758 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.333634 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.341898 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.343967 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qgvrk" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.356409 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.381534 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.385377 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.387749 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-f2wks" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.401302 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbcj\" (UniqueName: \"kubernetes.io/projected/aa7a1390-ce2f-4102-b998-d4dcf56abf25-kube-api-access-rzbcj\") pod \"heat-operator-controller-manager-67dd5f86f5-nsnz7\" (UID: \"aa7a1390-ce2f-4102-b998-d4dcf56abf25\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.401566 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpxs\" (UniqueName: \"kubernetes.io/projected/e7fc79ba-0394-4b4d-94d3-7fb983330881-kube-api-access-dfpxs\") pod \"barbican-operator-controller-manager-59bc569d95-rsjkx\" (UID: \"e7fc79ba-0394-4b4d-94d3-7fb983330881\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.401894 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxqkg\" (UniqueName: \"kubernetes.io/projected/04deb0db-06f5-428b-8c1f-b1c4585d3b79-kube-api-access-fxqkg\") pod \"glance-operator-controller-manager-79df6bcc97-c7hxm\" (UID: \"04deb0db-06f5-428b-8c1f-b1c4585d3b79\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.402049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rklsl\" (UniqueName: 
\"kubernetes.io/projected/2c163c6b-f034-4ba4-bd7e-ab170b41cc23-kube-api-access-rklsl\") pod \"cinder-operator-controller-manager-8d58dc466-c4m65\" (UID: \"2c163c6b-f034-4ba4-bd7e-ab170b41cc23\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.402151 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjg2n\" (UniqueName: \"kubernetes.io/projected/bc0b0e69-c1d0-4e53-bc54-12ee0ccab318-kube-api-access-qjg2n\") pod \"designate-operator-controller-manager-588d4d986b-2f42g\" (UID: \"bc0b0e69-c1d0-4e53-bc54-12ee0ccab318\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.410139 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.413960 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.416577 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-c5fng" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.416879 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.419536 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.426871 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.455401 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.456737 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.458571 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x85rl" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.465177 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.477558 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.479245 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.480886 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kb8nh" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.508289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rklsl\" (UniqueName: \"kubernetes.io/projected/2c163c6b-f034-4ba4-bd7e-ab170b41cc23-kube-api-access-rklsl\") pod \"cinder-operator-controller-manager-8d58dc466-c4m65\" (UID: \"2c163c6b-f034-4ba4-bd7e-ab170b41cc23\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.508336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjg2n\" (UniqueName: \"kubernetes.io/projected/bc0b0e69-c1d0-4e53-bc54-12ee0ccab318-kube-api-access-qjg2n\") pod \"designate-operator-controller-manager-588d4d986b-2f42g\" (UID: \"bc0b0e69-c1d0-4e53-bc54-12ee0ccab318\") " 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.508371 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9sf\" (UniqueName: \"kubernetes.io/projected/55bf4c3f-da09-440f-9d0d-27942727e7eb-kube-api-access-wg9sf\") pod \"horizon-operator-controller-manager-8464cc45fb-cqxrg\" (UID: \"55bf4c3f-da09-440f-9d0d-27942727e7eb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.508390 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrqn8\" (UniqueName: \"kubernetes.io/projected/f923626e-2cdd-413f-8b7d-e983841061da-kube-api-access-hrqn8\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.508412 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbcj\" (UniqueName: \"kubernetes.io/projected/aa7a1390-ce2f-4102-b998-d4dcf56abf25-kube-api-access-rzbcj\") pod \"heat-operator-controller-manager-67dd5f86f5-nsnz7\" (UID: \"aa7a1390-ce2f-4102-b998-d4dcf56abf25\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.508434 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.509085 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpxs\" (UniqueName: \"kubernetes.io/projected/e7fc79ba-0394-4b4d-94d3-7fb983330881-kube-api-access-dfpxs\") pod \"barbican-operator-controller-manager-59bc569d95-rsjkx\" (UID: \"e7fc79ba-0394-4b4d-94d3-7fb983330881\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.509138 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxqkg\" (UniqueName: \"kubernetes.io/projected/04deb0db-06f5-428b-8c1f-b1c4585d3b79-kube-api-access-fxqkg\") pod \"glance-operator-controller-manager-79df6bcc97-c7hxm\" (UID: \"04deb0db-06f5-428b-8c1f-b1c4585d3b79\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.512543 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.517421 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.518168 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.520395 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rdfbh" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.531976 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.532881 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.541218 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.543905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxqkg\" (UniqueName: \"kubernetes.io/projected/04deb0db-06f5-428b-8c1f-b1c4585d3b79-kube-api-access-fxqkg\") pod \"glance-operator-controller-manager-79df6bcc97-c7hxm\" (UID: \"04deb0db-06f5-428b-8c1f-b1c4585d3b79\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.550607 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rklsl\" (UniqueName: \"kubernetes.io/projected/2c163c6b-f034-4ba4-bd7e-ab170b41cc23-kube-api-access-rklsl\") pod \"cinder-operator-controller-manager-8d58dc466-c4m65\" (UID: \"2c163c6b-f034-4ba4-bd7e-ab170b41cc23\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.551137 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xfg8c" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.552389 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpxs\" (UniqueName: \"kubernetes.io/projected/e7fc79ba-0394-4b4d-94d3-7fb983330881-kube-api-access-dfpxs\") pod \"barbican-operator-controller-manager-59bc569d95-rsjkx\" (UID: \"e7fc79ba-0394-4b4d-94d3-7fb983330881\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.563062 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rzbcj\" (UniqueName: \"kubernetes.io/projected/aa7a1390-ce2f-4102-b998-d4dcf56abf25-kube-api-access-rzbcj\") pod \"heat-operator-controller-manager-67dd5f86f5-nsnz7\" (UID: \"aa7a1390-ce2f-4102-b998-d4dcf56abf25\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.560733 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-vj44m"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.564561 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.565880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjg2n\" (UniqueName: \"kubernetes.io/projected/bc0b0e69-c1d0-4e53-bc54-12ee0ccab318-kube-api-access-qjg2n\") pod \"designate-operator-controller-manager-588d4d986b-2f42g\" (UID: \"bc0b0e69-c1d0-4e53-bc54-12ee0ccab318\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.565947 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.567576 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-zch7l" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.574850 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-vj44m"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.575102 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.590048 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.590871 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.592427 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7p78j" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.599010 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.601719 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610792 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2bmz\" (UniqueName: \"kubernetes.io/projected/865b7d13-bcd2-4ff5-ab6a-70dc8b85206b-kube-api-access-m2bmz\") pod \"ironic-operator-controller-manager-6f787dddc9-cb29f\" (UID: \"865b7d13-bcd2-4ff5-ab6a-70dc8b85206b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610835 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfvm\" (UniqueName: \"kubernetes.io/projected/191b3452-75a7-49eb-953f-606943d143eb-kube-api-access-znfvm\") pod \"manila-operator-controller-manager-55f864c847-xwbwb\" (UID: \"191b3452-75a7-49eb-953f-606943d143eb\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610861 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2vs\" (UniqueName: \"kubernetes.io/projected/0749ae95-942e-4331-bf11-707bb1cc131d-kube-api-access-hr2vs\") pod \"keystone-operator-controller-manager-768b96df4c-wpc8r\" (UID: \"0749ae95-942e-4331-bf11-707bb1cc131d\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610894 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9sf\" (UniqueName: \"kubernetes.io/projected/55bf4c3f-da09-440f-9d0d-27942727e7eb-kube-api-access-wg9sf\") pod \"horizon-operator-controller-manager-8464cc45fb-cqxrg\" (UID: \"55bf4c3f-da09-440f-9d0d-27942727e7eb\") " 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610912 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrqn8\" (UniqueName: \"kubernetes.io/projected/f923626e-2cdd-413f-8b7d-e983841061da-kube-api-access-hrqn8\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.610987 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfhn2\" (UniqueName: \"kubernetes.io/projected/670fb623-168c-44fc-a437-daaaa77ea3cd-kube-api-access-pfhn2\") pod \"mariadb-operator-controller-manager-67ccfc9778-png26\" (UID: \"670fb623-168c-44fc-a437-daaaa77ea3cd\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:28:56 crc kubenswrapper[4921]: E0318 12:28:56.613522 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:28:56 crc kubenswrapper[4921]: E0318 12:28:56.613564 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert podName:f923626e-2cdd-413f-8b7d-e983841061da nodeName:}" failed. 
No retries permitted until 2026-03-18 12:28:57.113549744 +0000 UTC m=+1156.663470383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert") pod "infra-operator-controller-manager-7b9c774f96-bx4rb" (UID: "f923626e-2cdd-413f-8b7d-e983841061da") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.617367 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.618173 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.619018 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.620517 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fh5r6" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.636798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrqn8\" (UniqueName: \"kubernetes.io/projected/f923626e-2cdd-413f-8b7d-e983841061da-kube-api-access-hrqn8\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.640524 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.641034 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wg9sf\" (UniqueName: \"kubernetes.io/projected/55bf4c3f-da09-440f-9d0d-27942727e7eb-kube-api-access-wg9sf\") pod \"horizon-operator-controller-manager-8464cc45fb-cqxrg\" (UID: \"55bf4c3f-da09-440f-9d0d-27942727e7eb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.641321 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.645485 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.646522 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.654069 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.654375 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h5kmj" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.657052 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-srlqt"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.659381 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.660512 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-srlqt"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.661033 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mh9zp" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.665089 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.665845 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.669265 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8gbpj" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.677647 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.684148 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.685756 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.689793 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fklss" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.690288 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.698378 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.708566 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716265 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kksk\" (UniqueName: \"kubernetes.io/projected/e9b16b0b-6ae7-4f94-947d-d14ccce79710-kube-api-access-7kksk\") pod \"neutron-operator-controller-manager-767865f676-vj44m\" (UID: \"e9b16b0b-6ae7-4f94-947d-d14ccce79710\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716323 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2bmz\" (UniqueName: \"kubernetes.io/projected/865b7d13-bcd2-4ff5-ab6a-70dc8b85206b-kube-api-access-m2bmz\") pod \"ironic-operator-controller-manager-6f787dddc9-cb29f\" (UID: \"865b7d13-bcd2-4ff5-ab6a-70dc8b85206b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716358 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-znfvm\" (UniqueName: \"kubernetes.io/projected/191b3452-75a7-49eb-953f-606943d143eb-kube-api-access-znfvm\") pod \"manila-operator-controller-manager-55f864c847-xwbwb\" (UID: \"191b3452-75a7-49eb-953f-606943d143eb\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2vs\" (UniqueName: \"kubernetes.io/projected/0749ae95-942e-4331-bf11-707bb1cc131d-kube-api-access-hr2vs\") pod \"keystone-operator-controller-manager-768b96df4c-wpc8r\" (UID: \"0749ae95-942e-4331-bf11-707bb1cc131d\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716417 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjwhr\" (UniqueName: \"kubernetes.io/projected/a830f92b-2266-4b87-a165-a8db80990181-kube-api-access-fjwhr\") pod \"nova-operator-controller-manager-5d488d59fb-2nnr8\" (UID: \"a830f92b-2266-4b87-a165-a8db80990181\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716478 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmd9\" (UniqueName: \"kubernetes.io/projected/7c559a09-bcdc-4c4d-b326-1e91e920b262-kube-api-access-qkmd9\") pod \"octavia-operator-controller-manager-5b9f45d989-zst7p\" (UID: \"7c559a09-bcdc-4c4d-b326-1e91e920b262\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.716508 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfhn2\" (UniqueName: 
\"kubernetes.io/projected/670fb623-168c-44fc-a437-daaaa77ea3cd-kube-api-access-pfhn2\") pod \"mariadb-operator-controller-manager-67ccfc9778-png26\" (UID: \"670fb623-168c-44fc-a437-daaaa77ea3cd\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.722287 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.755506 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2vs\" (UniqueName: \"kubernetes.io/projected/0749ae95-942e-4331-bf11-707bb1cc131d-kube-api-access-hr2vs\") pod \"keystone-operator-controller-manager-768b96df4c-wpc8r\" (UID: \"0749ae95-942e-4331-bf11-707bb1cc131d\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.760308 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfvm\" (UniqueName: \"kubernetes.io/projected/191b3452-75a7-49eb-953f-606943d143eb-kube-api-access-znfvm\") pod \"manila-operator-controller-manager-55f864c847-xwbwb\" (UID: \"191b3452-75a7-49eb-953f-606943d143eb\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.766606 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfhn2\" (UniqueName: \"kubernetes.io/projected/670fb623-168c-44fc-a437-daaaa77ea3cd-kube-api-access-pfhn2\") pod \"mariadb-operator-controller-manager-67ccfc9778-png26\" (UID: \"670fb623-168c-44fc-a437-daaaa77ea3cd\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.774752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m2bmz\" (UniqueName: \"kubernetes.io/projected/865b7d13-bcd2-4ff5-ab6a-70dc8b85206b-kube-api-access-m2bmz\") pod \"ironic-operator-controller-manager-6f787dddc9-cb29f\" (UID: \"865b7d13-bcd2-4ff5-ab6a-70dc8b85206b\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.789463 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.801542 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.854813 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmd9\" (UniqueName: \"kubernetes.io/projected/7c559a09-bcdc-4c4d-b326-1e91e920b262-kube-api-access-qkmd9\") pod \"octavia-operator-controller-manager-5b9f45d989-zst7p\" (UID: \"7c559a09-bcdc-4c4d-b326-1e91e920b262\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855589 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9bc6\" (UniqueName: \"kubernetes.io/projected/958f7207-3507-4bbc-88ac-4f0e7f19f154-kube-api-access-c9bc6\") pod \"swift-operator-controller-manager-c674c5965-n8b8w\" (UID: \"958f7207-3507-4bbc-88ac-4f0e7f19f154\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855618 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76664\" (UniqueName: \"kubernetes.io/projected/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-kube-api-access-76664\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855649 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kksk\" (UniqueName: \"kubernetes.io/projected/e9b16b0b-6ae7-4f94-947d-d14ccce79710-kube-api-access-7kksk\") pod \"neutron-operator-controller-manager-767865f676-vj44m\" (UID: \"e9b16b0b-6ae7-4f94-947d-d14ccce79710\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855686 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xss45\" (UniqueName: \"kubernetes.io/projected/df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb-kube-api-access-xss45\") pod \"placement-operator-controller-manager-5784578c99-bhx7s\" (UID: \"df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855717 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trw6\" (UniqueName: \"kubernetes.io/projected/34010ed3-fc84-42ad-9011-160d4a107029-kube-api-access-5trw6\") pod \"ovn-operator-controller-manager-884679f54-srlqt\" (UID: \"34010ed3-fc84-42ad-9011-160d4a107029\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjwhr\" 
(UniqueName: \"kubernetes.io/projected/a830f92b-2266-4b87-a165-a8db80990181-kube-api-access-fjwhr\") pod \"nova-operator-controller-manager-5d488d59fb-2nnr8\" (UID: \"a830f92b-2266-4b87-a165-a8db80990181\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.855841 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.865523 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.866548 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.879970 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-x4w2x" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.892262 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kksk\" (UniqueName: \"kubernetes.io/projected/e9b16b0b-6ae7-4f94-947d-d14ccce79710-kube-api-access-7kksk\") pod \"neutron-operator-controller-manager-767865f676-vj44m\" (UID: \"e9b16b0b-6ae7-4f94-947d-d14ccce79710\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.894341 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjwhr\" (UniqueName: \"kubernetes.io/projected/a830f92b-2266-4b87-a165-a8db80990181-kube-api-access-fjwhr\") pod \"nova-operator-controller-manager-5d488d59fb-2nnr8\" (UID: \"a830f92b-2266-4b87-a165-a8db80990181\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.900413 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.913822 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmd9\" (UniqueName: \"kubernetes.io/projected/7c559a09-bcdc-4c4d-b326-1e91e920b262-kube-api-access-qkmd9\") pod \"octavia-operator-controller-manager-5b9f45d989-zst7p\" (UID: \"7c559a09-bcdc-4c4d-b326-1e91e920b262\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.937093 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.938290 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.942647 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jzx8p" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.952542 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.957442 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqzm\" (UniqueName: \"kubernetes.io/projected/ad4ad53b-44b8-46d4-8ef2-04e7859c3e60-kube-api-access-rrqzm\") pod \"test-operator-controller-manager-5c5cb9c4d7-n67kw\" (UID: \"ad4ad53b-44b8-46d4-8ef2-04e7859c3e60\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.957814 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22btr\" (UniqueName: \"kubernetes.io/projected/49d9ed92-040c-45cd-ba21-a5b96f07fe95-kube-api-access-22btr\") pod \"telemetry-operator-controller-manager-d6b694c5-ps7nq\" (UID: \"49d9ed92-040c-45cd-ba21-a5b96f07fe95\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.957962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9bc6\" (UniqueName: \"kubernetes.io/projected/958f7207-3507-4bbc-88ac-4f0e7f19f154-kube-api-access-c9bc6\") pod \"swift-operator-controller-manager-c674c5965-n8b8w\" (UID: 
\"958f7207-3507-4bbc-88ac-4f0e7f19f154\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.958059 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76664\" (UniqueName: \"kubernetes.io/projected/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-kube-api-access-76664\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.958360 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xss45\" (UniqueName: \"kubernetes.io/projected/df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb-kube-api-access-xss45\") pod \"placement-operator-controller-manager-5784578c99-bhx7s\" (UID: \"df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.958487 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trw6\" (UniqueName: \"kubernetes.io/projected/34010ed3-fc84-42ad-9011-160d4a107029-kube-api-access-5trw6\") pod \"ovn-operator-controller-manager-884679f54-srlqt\" (UID: \"34010ed3-fc84-42ad-9011-160d4a107029\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.958656 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h"] Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.958837 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod 
\"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:56 crc kubenswrapper[4921]: E0318 12:28:56.959196 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:28:56 crc kubenswrapper[4921]: E0318 12:28:56.959383 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert podName:6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba nodeName:}" failed. No retries permitted until 2026-03-18 12:28:57.459362928 +0000 UTC m=+1157.009283567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-wb587" (UID: "6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.959831 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.968865 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-92jqh" Mar 18 12:28:56 crc kubenswrapper[4921]: I0318 12:28:56.989953 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.033479 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.044980 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9bc6\" (UniqueName: \"kubernetes.io/projected/958f7207-3507-4bbc-88ac-4f0e7f19f154-kube-api-access-c9bc6\") pod \"swift-operator-controller-manager-c674c5965-n8b8w\" (UID: \"958f7207-3507-4bbc-88ac-4f0e7f19f154\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.052527 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76664\" (UniqueName: \"kubernetes.io/projected/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-kube-api-access-76664\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.063828 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.066874 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trw6\" (UniqueName: \"kubernetes.io/projected/34010ed3-fc84-42ad-9011-160d4a107029-kube-api-access-5trw6\") pod \"ovn-operator-controller-manager-884679f54-srlqt\" (UID: \"34010ed3-fc84-42ad-9011-160d4a107029\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.068320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqzm\" (UniqueName: \"kubernetes.io/projected/ad4ad53b-44b8-46d4-8ef2-04e7859c3e60-kube-api-access-rrqzm\") pod \"test-operator-controller-manager-5c5cb9c4d7-n67kw\" (UID: \"ad4ad53b-44b8-46d4-8ef2-04e7859c3e60\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.074857 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22btr\" (UniqueName: \"kubernetes.io/projected/49d9ed92-040c-45cd-ba21-a5b96f07fe95-kube-api-access-22btr\") pod \"telemetry-operator-controller-manager-d6b694c5-ps7nq\" (UID: \"49d9ed92-040c-45cd-ba21-a5b96f07fe95\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.077519 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.079871 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xss45\" (UniqueName: \"kubernetes.io/projected/df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb-kube-api-access-xss45\") pod \"placement-operator-controller-manager-5784578c99-bhx7s\" (UID: \"df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.085987 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.104072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqzm\" (UniqueName: \"kubernetes.io/projected/ad4ad53b-44b8-46d4-8ef2-04e7859c3e60-kube-api-access-rrqzm\") pod \"test-operator-controller-manager-5c5cb9c4d7-n67kw\" (UID: \"ad4ad53b-44b8-46d4-8ef2-04e7859c3e60\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.110606 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.128751 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22btr\" (UniqueName: \"kubernetes.io/projected/49d9ed92-040c-45cd-ba21-a5b96f07fe95-kube-api-access-22btr\") pod \"telemetry-operator-controller-manager-d6b694c5-ps7nq\" (UID: \"49d9ed92-040c-45cd-ba21-a5b96f07fe95\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.150438 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.151336 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.156495 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wt69w" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.160473 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.160682 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.168366 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.176882 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") 
pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.176949 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtw5m\" (UniqueName: \"kubernetes.io/projected/f21bbed2-2ad1-468c-806f-eda2d4f2264e-kube-api-access-wtw5m\") pod \"watcher-operator-controller-manager-6c4d75f7f9-h8b2h\" (UID: \"f21bbed2-2ad1-468c-806f-eda2d4f2264e\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.178579 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wx4\" (UniqueName: \"kubernetes.io/projected/09b26899-4ee2-482d-b190-b57c5d4cdfd3-kube-api-access-j4wx4\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.178632 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.178698 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.178901 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.178946 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert podName:f923626e-2cdd-413f-8b7d-e983841061da nodeName:}" failed. No retries permitted until 2026-03-18 12:28:58.17893135 +0000 UTC m=+1157.728851989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert") pod "infra-operator-controller-manager-7b9c774f96-bx4rb" (UID: "f923626e-2cdd-413f-8b7d-e983841061da") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.185473 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.199216 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.231324 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.248075 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.248849 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.258697 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-2xmml" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.272524 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.289963 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdmqv\" (UniqueName: \"kubernetes.io/projected/12fa5cc9-3f33-4574-831e-87596175e789-kube-api-access-pdmqv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wzcxk\" (UID: \"12fa5cc9-3f33-4574-831e-87596175e789\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.290035 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.290092 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtw5m\" (UniqueName: \"kubernetes.io/projected/f21bbed2-2ad1-468c-806f-eda2d4f2264e-kube-api-access-wtw5m\") pod \"watcher-operator-controller-manager-6c4d75f7f9-h8b2h\" (UID: \"f21bbed2-2ad1-468c-806f-eda2d4f2264e\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.290139 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j4wx4\" (UniqueName: \"kubernetes.io/projected/09b26899-4ee2-482d-b190-b57c5d4cdfd3-kube-api-access-j4wx4\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.290186 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.290347 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.290401 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:28:57.790385314 +0000 UTC m=+1157.340305953 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "metrics-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.291170 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.291201 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:28:57.791194816 +0000 UTC m=+1157.341115455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.329098 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wx4\" (UniqueName: \"kubernetes.io/projected/09b26899-4ee2-482d-b190-b57c5d4cdfd3-kube-api-access-j4wx4\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.332142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtw5m\" (UniqueName: \"kubernetes.io/projected/f21bbed2-2ad1-468c-806f-eda2d4f2264e-kube-api-access-wtw5m\") pod \"watcher-operator-controller-manager-6c4d75f7f9-h8b2h\" (UID: \"f21bbed2-2ad1-468c-806f-eda2d4f2264e\") " 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.358051 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.391622 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmqv\" (UniqueName: \"kubernetes.io/projected/12fa5cc9-3f33-4574-831e-87596175e789-kube-api-access-pdmqv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wzcxk\" (UID: \"12fa5cc9-3f33-4574-831e-87596175e789\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.416811 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmqv\" (UniqueName: \"kubernetes.io/projected/12fa5cc9-3f33-4574-831e-87596175e789-kube-api-access-pdmqv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wzcxk\" (UID: \"12fa5cc9-3f33-4574-831e-87596175e789\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.496920 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.497186 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.497241 4921 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert podName:6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba nodeName:}" failed. No retries permitted until 2026-03-18 12:28:58.497224236 +0000 UTC m=+1158.047144885 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-wb587" (UID: "6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.516004 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.546846 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.561492 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65"] Mar 18 12:28:57 crc kubenswrapper[4921]: W0318 12:28:57.581342 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7fc79ba_0394_4b4d_94d3_7fb983330881.slice/crio-34b640999e8617704d9ffc37804b9de93a6e4906ed7c2b3c9bcb9f26623f9ed4 WatchSource:0}: Error finding container 34b640999e8617704d9ffc37804b9de93a6e4906ed7c2b3c9bcb9f26623f9ed4: Status 404 returned error can't find the container with id 34b640999e8617704d9ffc37804b9de93a6e4906ed7c2b3c9bcb9f26623f9ed4 Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.584552 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.685548 4921 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g"] Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.692802 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.803057 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.803430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.803256 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.803591 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.803600 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:28:58.803576203 +0000 UTC m=+1158.353496842 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "webhook-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: E0318 12:28:57.803637 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:28:58.803622305 +0000 UTC m=+1158.353542944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "metrics-server-cert" not found Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.948396 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" event={"ID":"2c163c6b-f034-4ba4-bd7e-ab170b41cc23","Type":"ContainerStarted","Data":"edfc44b6c3b26cb10ac5d514d1dfdd1a931d60dbf7b4086dc2a3a38ba22c0f53"} Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.948988 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" event={"ID":"e7fc79ba-0394-4b4d-94d3-7fb983330881","Type":"ContainerStarted","Data":"34b640999e8617704d9ffc37804b9de93a6e4906ed7c2b3c9bcb9f26623f9ed4"} Mar 18 12:28:57 crc kubenswrapper[4921]: I0318 12:28:57.950695 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" 
event={"ID":"bc0b0e69-c1d0-4e53-bc54-12ee0ccab318","Type":"ContainerStarted","Data":"599740d87f63dd541f70fe7cb60d53c9d7c8559269b5f51c40050c2680ffe4bd"} Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.063280 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.081996 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.095156 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.112989 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.150418 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.171357 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.204703 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.212849 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 
12:28:58.213002 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.213061 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert podName:f923626e-2cdd-413f-8b7d-e983841061da nodeName:}" failed. No retries permitted until 2026-03-18 12:29:00.213044089 +0000 UTC m=+1159.762964728 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert") pod "infra-operator-controller-manager-7b9c774f96-bx4rb" (UID: "f923626e-2cdd-413f-8b7d-e983841061da") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.516506 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.516697 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.516756 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert podName:6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba nodeName:}" failed. No retries permitted until 2026-03-18 12:29:00.516737441 +0000 UTC m=+1160.066658080 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-wb587" (UID: "6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.579165 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.586040 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-vj44m"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.593261 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.600898 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.616084 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-srlqt"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.629887 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h"] Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.647237 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk"] Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.655584 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-22btr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-ps7nq_openstack-operators(49d9ed92-040c-45cd-ba21-a5b96f07fe95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.659738 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq"] Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.659933 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" podUID="49d9ed92-040c-45cd-ba21-a5b96f07fe95" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.673668 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pdmqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wzcxk_openstack-operators(12fa5cc9-3f33-4574-831e-87596175e789): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.673847 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtw5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-h8b2h_openstack-operators(f21bbed2-2ad1-468c-806f-eda2d4f2264e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.673925 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p"] Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.676995 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" podUID="f21bbed2-2ad1-468c-806f-eda2d4f2264e" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.677074 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" podUID="12fa5cc9-3f33-4574-831e-87596175e789" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.682135 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8"] Mar 18 12:28:58 crc kubenswrapper[4921]: W0318 12:28:58.726911 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda830f92b_2266_4b87_a165_a8db80990181.slice/crio-e5b3ce468b7f2d41e454ca09f4b5396c15091a907551cd2014946a63e608155e WatchSource:0}: Error finding container e5b3ce468b7f2d41e454ca09f4b5396c15091a907551cd2014946a63e608155e: Status 404 returned error can't find the container with id e5b3ce468b7f2d41e454ca09f4b5396c15091a907551cd2014946a63e608155e Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.773469 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fjwhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-2nnr8_openstack-operators(a830f92b-2266-4b87-a165-a8db80990181): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.774037 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qkmd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-zst7p_openstack-operators(7c559a09-bcdc-4c4d-b326-1e91e920b262): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.775333 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" podUID="7c559a09-bcdc-4c4d-b326-1e91e920b262" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.775950 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" podUID="a830f92b-2266-4b87-a165-a8db80990181" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.826190 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: 
\"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.826326 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.826422 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.826477 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.826506 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:00.826487785 +0000 UTC m=+1160.376408424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "metrics-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.826540 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:00.826521736 +0000 UTC m=+1160.376442455 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "webhook-server-cert" not found Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.967012 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" event={"ID":"ad4ad53b-44b8-46d4-8ef2-04e7859c3e60","Type":"ContainerStarted","Data":"a9b46997b45f5c23f19069913d1e47c33ccbff0dd5a2159eeb3e49572b54e1c8"} Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.971242 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" event={"ID":"191b3452-75a7-49eb-953f-606943d143eb","Type":"ContainerStarted","Data":"b1ec8099326191052c25e48da9ff860d9e45fe0ac168ae9da110acced27103c9"} Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.972611 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" event={"ID":"f21bbed2-2ad1-468c-806f-eda2d4f2264e","Type":"ContainerStarted","Data":"65e9c9a0799565c5ec46c4b9dc7fc77f7bff0c44a88b5a796e2d97f1ab6984de"} Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.974588 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" podUID="f21bbed2-2ad1-468c-806f-eda2d4f2264e" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.975832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" event={"ID":"7c559a09-bcdc-4c4d-b326-1e91e920b262","Type":"ContainerStarted","Data":"ebf07b60d07a440ad76e2b2fa87c45ad109a67a0017676be562b91741af086a8"} Mar 18 12:28:58 crc kubenswrapper[4921]: E0318 12:28:58.977035 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" podUID="7c559a09-bcdc-4c4d-b326-1e91e920b262" Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.977982 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" event={"ID":"aa7a1390-ce2f-4102-b998-d4dcf56abf25","Type":"ContainerStarted","Data":"93d39b79db32302f94654218d4e8d36844c22ada5402d0ba575ab364611a4dc7"} Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.983421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" event={"ID":"e9b16b0b-6ae7-4f94-947d-d14ccce79710","Type":"ContainerStarted","Data":"369aac24494baa0ac5da8e876e65a9fb6a5e6253e069661bb9d6fe758f337cc2"} Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.992678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" event={"ID":"df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb","Type":"ContainerStarted","Data":"86b5228fb264036ea36400b52ab0f418b04790c2c92d9b1d1086fc71ae0a8e0b"} Mar 18 12:28:58 crc kubenswrapper[4921]: I0318 12:28:58.996206 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" 
event={"ID":"958f7207-3507-4bbc-88ac-4f0e7f19f154","Type":"ContainerStarted","Data":"fcf988a57ada78ecbfa2743ad5df8bf549f8be369b07603bea3cdab900c27407"} Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.000211 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" event={"ID":"49d9ed92-040c-45cd-ba21-a5b96f07fe95","Type":"ContainerStarted","Data":"ed5571ceb5c800fd900cb5f62d956b06c9d47abd9572306df1749a3169c61dc3"} Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.002685 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" event={"ID":"34010ed3-fc84-42ad-9011-160d4a107029","Type":"ContainerStarted","Data":"d0695ff8930303824948bda9c14bb7a9da10c556ef080477399f8ccf66153e5e"} Mar 18 12:28:59 crc kubenswrapper[4921]: E0318 12:28:59.003863 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" podUID="49d9ed92-040c-45cd-ba21-a5b96f07fe95" Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.004886 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" event={"ID":"55bf4c3f-da09-440f-9d0d-27942727e7eb","Type":"ContainerStarted","Data":"33a7e2bb392a2cb2a05dcd6f66885b0a0e76265e899fa7c551bf2ffe5beffef5"} Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.005974 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" 
event={"ID":"670fb623-168c-44fc-a437-daaaa77ea3cd","Type":"ContainerStarted","Data":"762363badead93c1e63199185836c91eb6111f32b7c1110834887627aa2c94ef"} Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.009406 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" event={"ID":"04deb0db-06f5-428b-8c1f-b1c4585d3b79","Type":"ContainerStarted","Data":"6b2056a32b7139d027985ca8ae21209c483a75b0aeb00619403efb8b257c62b5"} Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.011498 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" event={"ID":"a830f92b-2266-4b87-a165-a8db80990181","Type":"ContainerStarted","Data":"e5b3ce468b7f2d41e454ca09f4b5396c15091a907551cd2014946a63e608155e"} Mar 18 12:28:59 crc kubenswrapper[4921]: E0318 12:28:59.013416 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" podUID="a830f92b-2266-4b87-a165-a8db80990181" Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.014629 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" event={"ID":"12fa5cc9-3f33-4574-831e-87596175e789","Type":"ContainerStarted","Data":"72ac64e437f4dc9e08f03896b559ffa9a0aed46cb92c9c2084769fea162af7d3"} Mar 18 12:28:59 crc kubenswrapper[4921]: E0318 12:28:59.016048 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" podUID="12fa5cc9-3f33-4574-831e-87596175e789" Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.016908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" event={"ID":"0749ae95-942e-4331-bf11-707bb1cc131d","Type":"ContainerStarted","Data":"bd16afbe0e2310f5fa552b45ba398c7dab2ad0dcebfeedf86584f9a982141bb8"} Mar 18 12:28:59 crc kubenswrapper[4921]: I0318 12:28:59.020913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" event={"ID":"865b7d13-bcd2-4ff5-ab6a-70dc8b85206b","Type":"ContainerStarted","Data":"73d3cc17cd69154deb1de06c436429952d9943294acb722c9c68b43414406994"} Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.090762 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" podUID="12fa5cc9-3f33-4574-831e-87596175e789" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.091292 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" podUID="f21bbed2-2ad1-468c-806f-eda2d4f2264e" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.091363 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" podUID="7c559a09-bcdc-4c4d-b326-1e91e920b262" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.091404 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" podUID="49d9ed92-040c-45cd-ba21-a5b96f07fe95" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.091438 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" podUID="a830f92b-2266-4b87-a165-a8db80990181" Mar 18 12:29:00 crc kubenswrapper[4921]: I0318 12:29:00.248398 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.248570 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:00 
crc kubenswrapper[4921]: E0318 12:29:00.248638 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert podName:f923626e-2cdd-413f-8b7d-e983841061da nodeName:}" failed. No retries permitted until 2026-03-18 12:29:04.24862039 +0000 UTC m=+1163.798541029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert") pod "infra-operator-controller-manager-7b9c774f96-bx4rb" (UID: "f923626e-2cdd-413f-8b7d-e983841061da") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:00 crc kubenswrapper[4921]: I0318 12:29:00.552692 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.552829 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.552892 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert podName:6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba nodeName:}" failed. No retries permitted until 2026-03-18 12:29:04.552875588 +0000 UTC m=+1164.102796227 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-wb587" (UID: "6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:00 crc kubenswrapper[4921]: I0318 12:29:00.858590 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:00 crc kubenswrapper[4921]: I0318 12:29:00.858680 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.858792 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.858838 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:04.858824245 +0000 UTC m=+1164.408744884 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "metrics-server-cert" not found Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.859076 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:29:00 crc kubenswrapper[4921]: E0318 12:29:00.859178 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:04.859158094 +0000 UTC m=+1164.409078733 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: I0318 12:29:04.315337 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.315538 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.315871 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert 
podName:f923626e-2cdd-413f-8b7d-e983841061da nodeName:}" failed. No retries permitted until 2026-03-18 12:29:12.315852493 +0000 UTC m=+1171.865773132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert") pod "infra-operator-controller-manager-7b9c774f96-bx4rb" (UID: "f923626e-2cdd-413f-8b7d-e983841061da") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: I0318 12:29:04.620270 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.620427 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.620482 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert podName:6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba nodeName:}" failed. No retries permitted until 2026-03-18 12:29:12.620464942 +0000 UTC m=+1172.170385581 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-wb587" (UID: "6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: I0318 12:29:04.924990 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:04 crc kubenswrapper[4921]: I0318 12:29:04.925134 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.925271 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.925326 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:12.925310467 +0000 UTC m=+1172.475231106 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "webhook-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.925334 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:29:04 crc kubenswrapper[4921]: E0318 12:29:04.925437 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:12.92541466 +0000 UTC m=+1172.475335349 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "metrics-server-cert" not found Mar 18 12:29:09 crc kubenswrapper[4921]: E0318 12:29:09.464263 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d" Mar 18 12:29:09 crc kubenswrapper[4921]: E0318 12:29:09.464903 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dfpxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-59bc569d95-rsjkx_openstack-operators(e7fc79ba-0394-4b4d-94d3-7fb983330881): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:09 crc kubenswrapper[4921]: E0318 12:29:09.466101 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" podUID="e7fc79ba-0394-4b4d-94d3-7fb983330881" Mar 18 12:29:10 crc kubenswrapper[4921]: E0318 12:29:10.150392 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" podUID="e7fc79ba-0394-4b4d-94d3-7fb983330881" Mar 18 12:29:11 crc kubenswrapper[4921]: E0318 12:29:11.446332 4921 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 18 12:29:11 crc kubenswrapper[4921]: E0318 12:29:11.446543 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xss45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-bhx7s_openstack-operators(df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:11 crc kubenswrapper[4921]: E0318 12:29:11.448037 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" podUID="df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.158183 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" podUID="df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.266658 4921 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.266867 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7kksk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-vj44m_openstack-operators(e9b16b0b-6ae7-4f94-947d-d14ccce79710): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.268387 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" podUID="e9b16b0b-6ae7-4f94-947d-d14ccce79710" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.341320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.351793 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f923626e-2cdd-413f-8b7d-e983841061da-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bx4rb\" (UID: \"f923626e-2cdd-413f-8b7d-e983841061da\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.638696 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.644936 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.649489 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-wb587\" (UID: \"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.745750 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.948852 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.949042 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rzbcj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-nsnz7_openstack-operators(aa7a1390-ce2f-4102-b998-d4dcf56abf25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.949409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.949490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.949638 4921 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.949694 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs podName:09b26899-4ee2-482d-b190-b57c5d4cdfd3 nodeName:}" failed. No retries permitted until 2026-03-18 12:29:28.949676399 +0000 UTC m=+1188.499597038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-xzxsj" (UID: "09b26899-4ee2-482d-b190-b57c5d4cdfd3") : secret "webhook-server-cert" not found Mar 18 12:29:12 crc kubenswrapper[4921]: E0318 12:29:12.950369 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" podUID="aa7a1390-ce2f-4102-b998-d4dcf56abf25" Mar 18 12:29:12 crc kubenswrapper[4921]: I0318 12:29:12.956545 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:13 crc kubenswrapper[4921]: E0318 12:29:13.164818 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" podUID="aa7a1390-ce2f-4102-b998-d4dcf56abf25" Mar 18 12:29:13 crc kubenswrapper[4921]: E0318 12:29:13.165237 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" podUID="e9b16b0b-6ae7-4f94-947d-d14ccce79710" Mar 18 12:29:13 crc kubenswrapper[4921]: E0318 12:29:13.577657 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 18 12:29:13 crc kubenswrapper[4921]: E0318 12:29:13.577994 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hr2vs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-wpc8r_openstack-operators(0749ae95-942e-4331-bf11-707bb1cc131d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:29:13 crc kubenswrapper[4921]: E0318 12:29:13.579181 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" podUID="0749ae95-942e-4331-bf11-707bb1cc131d" Mar 18 12:29:14 crc kubenswrapper[4921]: E0318 12:29:14.172045 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" podUID="0749ae95-942e-4331-bf11-707bb1cc131d" Mar 18 12:29:14 crc kubenswrapper[4921]: I0318 12:29:14.998728 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb"] Mar 18 12:29:15 crc kubenswrapper[4921]: W0318 12:29:15.297192 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf923626e_2cdd_413f_8b7d_e983841061da.slice/crio-fa248557cadb4147f35d6dd905c4ffb7709bb33c50786ab39b06dc4e83dea254 WatchSource:0}: Error finding container fa248557cadb4147f35d6dd905c4ffb7709bb33c50786ab39b06dc4e83dea254: Status 404 returned error can't find the container with id fa248557cadb4147f35d6dd905c4ffb7709bb33c50786ab39b06dc4e83dea254 Mar 18 12:29:15 crc kubenswrapper[4921]: I0318 12:29:15.726332 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587"] Mar 18 12:29:16 crc kubenswrapper[4921]: I0318 12:29:16.181130 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" event={"ID":"f923626e-2cdd-413f-8b7d-e983841061da","Type":"ContainerStarted","Data":"fa248557cadb4147f35d6dd905c4ffb7709bb33c50786ab39b06dc4e83dea254"} Mar 18 12:29:16 crc kubenswrapper[4921]: W0318 12:29:16.399765 4921 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f79bf2e_45a8_42d8_a3e5_a5322b80a0ba.slice/crio-b86a3aa2d5ff7be95f6929e13b033c923b66554934b9eacbe253ce7b695e23db WatchSource:0}: Error finding container b86a3aa2d5ff7be95f6929e13b033c923b66554934b9eacbe253ce7b695e23db: Status 404 returned error can't find the container with id b86a3aa2d5ff7be95f6929e13b033c923b66554934b9eacbe253ce7b695e23db Mar 18 12:29:17 crc kubenswrapper[4921]: I0318 12:29:17.081718 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:29:17 crc kubenswrapper[4921]: I0318 12:29:17.082359 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:29:17 crc kubenswrapper[4921]: I0318 12:29:17.190076 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" event={"ID":"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba","Type":"ContainerStarted","Data":"b86a3aa2d5ff7be95f6929e13b033c923b66554934b9eacbe253ce7b695e23db"} Mar 18 12:29:17 crc kubenswrapper[4921]: I0318 12:29:17.193351 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" event={"ID":"865b7d13-bcd2-4ff5-ab6a-70dc8b85206b","Type":"ContainerStarted","Data":"89cc0f29b98c6edb10ff637048df929f0492af8ab7c1cd16e2593f15852079d9"} Mar 18 12:29:17 crc kubenswrapper[4921]: I0318 12:29:17.193693 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:29:17 crc kubenswrapper[4921]: I0318 12:29:17.217031 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" podStartSLOduration=5.826964697 podStartE2EDuration="21.217011353s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.157566179 +0000 UTC m=+1157.707486818" lastFinishedPulling="2026-03-18 12:29:13.547612835 +0000 UTC m=+1173.097533474" observedRunningTime="2026-03-18 12:29:17.212427073 +0000 UTC m=+1176.762347712" watchObservedRunningTime="2026-03-18 12:29:17.217011353 +0000 UTC m=+1176.766931992" Mar 18 12:29:18 crc kubenswrapper[4921]: I0318 12:29:18.208170 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" event={"ID":"bc0b0e69-c1d0-4e53-bc54-12ee0ccab318","Type":"ContainerStarted","Data":"8a5974d13cec84879a7e3accdd90e2cf6194c57e103632fe07b145b72ad0c070"} Mar 18 12:29:18 crc kubenswrapper[4921]: I0318 12:29:18.208218 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:29:18 crc kubenswrapper[4921]: I0318 12:29:18.248344 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" podStartSLOduration=6.476774581 podStartE2EDuration="22.248313981s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:57.776222429 +0000 UTC m=+1157.326143058" lastFinishedPulling="2026-03-18 12:29:13.547761819 +0000 UTC m=+1173.097682458" observedRunningTime="2026-03-18 12:29:18.237131795 +0000 UTC m=+1177.787052434" watchObservedRunningTime="2026-03-18 12:29:18.248313981 +0000 UTC m=+1177.798234620" Mar 18 
12:29:22 crc kubenswrapper[4921]: I0318 12:29:22.250435 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" event={"ID":"2c163c6b-f034-4ba4-bd7e-ab170b41cc23","Type":"ContainerStarted","Data":"312701838fe9b6f6eb6ce69cce11a2b544b89e3757674eaa1d6ed810582d1c76"} Mar 18 12:29:22 crc kubenswrapper[4921]: I0318 12:29:22.251044 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:29:22 crc kubenswrapper[4921]: I0318 12:29:22.251915 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" event={"ID":"04deb0db-06f5-428b-8c1f-b1c4585d3b79","Type":"ContainerStarted","Data":"a2f70f6c5f1923a3cf35eea83d9c09578a5389d6bee7347fa275eb5163909e66"} Mar 18 12:29:22 crc kubenswrapper[4921]: I0318 12:29:22.252346 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:29:22 crc kubenswrapper[4921]: I0318 12:29:22.266043 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" podStartSLOduration=10.331125882 podStartE2EDuration="26.266009533s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:57.612453186 +0000 UTC m=+1157.162373825" lastFinishedPulling="2026-03-18 12:29:13.547336837 +0000 UTC m=+1173.097257476" observedRunningTime="2026-03-18 12:29:22.264676605 +0000 UTC m=+1181.814597254" watchObservedRunningTime="2026-03-18 12:29:22.266009533 +0000 UTC m=+1181.815930172" Mar 18 12:29:22 crc kubenswrapper[4921]: I0318 12:29:22.290470 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" 
podStartSLOduration=10.851122034 podStartE2EDuration="26.290443384s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.108317876 +0000 UTC m=+1157.658238515" lastFinishedPulling="2026-03-18 12:29:13.547639226 +0000 UTC m=+1173.097559865" observedRunningTime="2026-03-18 12:29:22.287842201 +0000 UTC m=+1181.837762840" watchObservedRunningTime="2026-03-18 12:29:22.290443384 +0000 UTC m=+1181.840364043" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.258388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" event={"ID":"f21bbed2-2ad1-468c-806f-eda2d4f2264e","Type":"ContainerStarted","Data":"525fec44fe4a6380ce5b0ed22c7197c4d81971aba899c7bb36e0d2eb09a37266"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.258882 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.260399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" event={"ID":"a830f92b-2266-4b87-a165-a8db80990181","Type":"ContainerStarted","Data":"4ef25a4602863e6f83aaa9d503070e80ef98c6c5bfc5e3f97d289e858ec4d88f"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.260633 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.261648 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" event={"ID":"7c559a09-bcdc-4c4d-b326-1e91e920b262","Type":"ContainerStarted","Data":"28fc59962a65f87d0eda02414637472d923a7bad40c17ee85da3e9aa1fb23a73"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.261930 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.263565 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" event={"ID":"12fa5cc9-3f33-4574-831e-87596175e789","Type":"ContainerStarted","Data":"c5e1add6ba3dd62f2a5ce192b39e15e79e943da5991b64c77181d8150871ed1f"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.264983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" event={"ID":"f923626e-2cdd-413f-8b7d-e983841061da","Type":"ContainerStarted","Data":"cbc9d948b05463ce1e97d3985a4c1abe5630b45de749b25d659120a06f8490aa"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.265337 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.266338 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" event={"ID":"ad4ad53b-44b8-46d4-8ef2-04e7859c3e60","Type":"ContainerStarted","Data":"5558dcc7a97194221f2bb004b6054ef3da27c601c732e5de42777c6c4fd0eb01"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.266468 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.267626 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" event={"ID":"6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba","Type":"ContainerStarted","Data":"1816873cc37f613a739f5258357548f4beb792ce932324be2a83941fd581d838"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 
12:29:23.268253 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.269064 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" event={"ID":"670fb623-168c-44fc-a437-daaaa77ea3cd","Type":"ContainerStarted","Data":"e9b171a84a0de44ac0f5dc167766f9a5fb9d35b59e06611e324990aae2768e07"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.269198 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.270632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" event={"ID":"958f7207-3507-4bbc-88ac-4f0e7f19f154","Type":"ContainerStarted","Data":"a246c300f211ab0e875940fa6eea6ee7da165d0f97c9042d75b7e5ae9565928e"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.271130 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.272624 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" event={"ID":"49d9ed92-040c-45cd-ba21-a5b96f07fe95","Type":"ContainerStarted","Data":"eb7536a6654ea0e3a0f302b0de5a41d9982144bca0febd6f77763c4e8fc26274"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.272982 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.274258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" event={"ID":"55bf4c3f-da09-440f-9d0d-27942727e7eb","Type":"ContainerStarted","Data":"e5c3758a06519a83333b7531aaa2fb5cac119721e82f36a5d8debdff08643297"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.274586 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.276865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" event={"ID":"34010ed3-fc84-42ad-9011-160d4a107029","Type":"ContainerStarted","Data":"c9057155c1c826355efd9e11b4c5a0bdf3f15fca1455060c69ac5bc3be8f7c19"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.276965 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.307222 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" event={"ID":"191b3452-75a7-49eb-953f-606943d143eb","Type":"ContainerStarted","Data":"d9ed15cc89f92f6362eb5082efeb41aa21e178b56513e21c895eb6f061473404"} Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.307555 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.476746 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" podStartSLOduration=5.524711175 podStartE2EDuration="27.476727228s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.773884076 +0000 UTC m=+1158.323804715" lastFinishedPulling="2026-03-18 
12:29:20.725900129 +0000 UTC m=+1180.275820768" observedRunningTime="2026-03-18 12:29:23.475194095 +0000 UTC m=+1183.025114734" watchObservedRunningTime="2026-03-18 12:29:23.476727228 +0000 UTC m=+1183.026647867" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.477527 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" podStartSLOduration=20.78533404 podStartE2EDuration="27.47751918s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:29:15.299478611 +0000 UTC m=+1174.849399270" lastFinishedPulling="2026-03-18 12:29:21.991663771 +0000 UTC m=+1181.541584410" observedRunningTime="2026-03-18 12:29:23.443152888 +0000 UTC m=+1182.993073537" watchObservedRunningTime="2026-03-18 12:29:23.47751918 +0000 UTC m=+1183.027439819" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.521490 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" podStartSLOduration=7.256548684 podStartE2EDuration="27.521465234s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.773330521 +0000 UTC m=+1158.323251160" lastFinishedPulling="2026-03-18 12:29:19.038247061 +0000 UTC m=+1178.588167710" observedRunningTime="2026-03-18 12:29:23.515272719 +0000 UTC m=+1183.065193378" watchObservedRunningTime="2026-03-18 12:29:23.521465234 +0000 UTC m=+1183.071385873" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.593891 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" podStartSLOduration=8.633973555 podStartE2EDuration="27.593870422s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.655429265 +0000 UTC m=+1158.205349904" lastFinishedPulling="2026-03-18 12:29:17.615326132 
+0000 UTC m=+1177.165246771" observedRunningTime="2026-03-18 12:29:23.553323805 +0000 UTC m=+1183.103244444" watchObservedRunningTime="2026-03-18 12:29:23.593870422 +0000 UTC m=+1183.143791061" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.594857 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" podStartSLOduration=8.654052933 podStartE2EDuration="27.59485171s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.673748163 +0000 UTC m=+1158.223668802" lastFinishedPulling="2026-03-18 12:29:17.61454694 +0000 UTC m=+1177.164467579" observedRunningTime="2026-03-18 12:29:23.582190732 +0000 UTC m=+1183.132111401" watchObservedRunningTime="2026-03-18 12:29:23.59485171 +0000 UTC m=+1183.144772349" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.606933 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" podStartSLOduration=10.960674763 podStartE2EDuration="27.606909591s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.624792698 +0000 UTC m=+1158.174713337" lastFinishedPulling="2026-03-18 12:29:15.271027506 +0000 UTC m=+1174.820948165" observedRunningTime="2026-03-18 12:29:23.602712712 +0000 UTC m=+1183.152633371" watchObservedRunningTime="2026-03-18 12:29:23.606909591 +0000 UTC m=+1183.156830230" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.622611 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" podStartSLOduration=10.548544323 podStartE2EDuration="27.622596515s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.196867071 +0000 UTC m=+1157.746787710" lastFinishedPulling="2026-03-18 12:29:15.270919273 +0000 UTC 
m=+1174.820839902" observedRunningTime="2026-03-18 12:29:23.620592128 +0000 UTC m=+1183.170512777" watchObservedRunningTime="2026-03-18 12:29:23.622596515 +0000 UTC m=+1183.172517154" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.649889 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" podStartSLOduration=12.211918155 podStartE2EDuration="27.649873647s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.110988911 +0000 UTC m=+1157.660909550" lastFinishedPulling="2026-03-18 12:29:13.548944403 +0000 UTC m=+1173.098865042" observedRunningTime="2026-03-18 12:29:23.642538659 +0000 UTC m=+1183.192459298" watchObservedRunningTime="2026-03-18 12:29:23.649873647 +0000 UTC m=+1183.199794286" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.673979 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" podStartSLOduration=11.033826453 podStartE2EDuration="27.673961878s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.630798358 +0000 UTC m=+1158.180718997" lastFinishedPulling="2026-03-18 12:29:15.270933783 +0000 UTC m=+1174.820854422" observedRunningTime="2026-03-18 12:29:23.671388435 +0000 UTC m=+1183.221309084" watchObservedRunningTime="2026-03-18 12:29:23.673961878 +0000 UTC m=+1183.223882517" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.724672 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" podStartSLOduration=22.07609622 podStartE2EDuration="27.724658503s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:29:16.436231003 +0000 UTC m=+1175.986151632" lastFinishedPulling="2026-03-18 12:29:22.084793276 +0000 UTC 
m=+1181.634713915" observedRunningTime="2026-03-18 12:29:23.724095417 +0000 UTC m=+1183.274016076" watchObservedRunningTime="2026-03-18 12:29:23.724658503 +0000 UTC m=+1183.274579142" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.807174 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" podStartSLOduration=11.404376158 podStartE2EDuration="27.807156237s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.18869122 +0000 UTC m=+1157.738611859" lastFinishedPulling="2026-03-18 12:29:14.591471299 +0000 UTC m=+1174.141391938" observedRunningTime="2026-03-18 12:29:23.770004896 +0000 UTC m=+1183.319925535" watchObservedRunningTime="2026-03-18 12:29:23.807156237 +0000 UTC m=+1183.357076876" Mar 18 12:29:23 crc kubenswrapper[4921]: I0318 12:29:23.810323 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wzcxk" podStartSLOduration=3.424063645 podStartE2EDuration="26.810316706s" podCreationTimestamp="2026-03-18 12:28:57 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.673520827 +0000 UTC m=+1158.223441466" lastFinishedPulling="2026-03-18 12:29:22.059773888 +0000 UTC m=+1181.609694527" observedRunningTime="2026-03-18 12:29:23.805377596 +0000 UTC m=+1183.355298235" watchObservedRunningTime="2026-03-18 12:29:23.810316706 +0000 UTC m=+1183.360237345" Mar 18 12:29:24 crc kubenswrapper[4921]: I0318 12:29:24.237761 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" podStartSLOduration=13.321381805 podStartE2EDuration="28.237726269s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.639826103 +0000 UTC m=+1158.189746742" lastFinishedPulling="2026-03-18 12:29:13.556170567 +0000 UTC m=+1173.106091206" 
observedRunningTime="2026-03-18 12:29:23.842079805 +0000 UTC m=+1183.392000444" watchObservedRunningTime="2026-03-18 12:29:24.237726269 +0000 UTC m=+1183.787646908" Mar 18 12:29:24 crc kubenswrapper[4921]: I0318 12:29:24.313815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" event={"ID":"e7fc79ba-0394-4b4d-94d3-7fb983330881","Type":"ContainerStarted","Data":"079c814a0fac8a5eb758d6b1fe308a434ecd59781c3e05c2e5ca15e21d82acc9"} Mar 18 12:29:24 crc kubenswrapper[4921]: I0318 12:29:24.334728 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" podStartSLOduration=2.203400675 podStartE2EDuration="28.334710882s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:57.584294149 +0000 UTC m=+1157.134214788" lastFinishedPulling="2026-03-18 12:29:23.715604356 +0000 UTC m=+1183.265524995" observedRunningTime="2026-03-18 12:29:24.33321937 +0000 UTC m=+1183.883140019" watchObservedRunningTime="2026-03-18 12:29:24.334710882 +0000 UTC m=+1183.884631521" Mar 18 12:29:25 crc kubenswrapper[4921]: I0318 12:29:25.320146 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" event={"ID":"df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb","Type":"ContainerStarted","Data":"2e2495c129bfcbf75f714f3278d8803b25d1a76cb59ddc71c0803f8eaeefe46f"} Mar 18 12:29:25 crc kubenswrapper[4921]: I0318 12:29:25.320666 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:29:25 crc kubenswrapper[4921]: I0318 12:29:25.339115 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" podStartSLOduration=3.3627284570000002 
podStartE2EDuration="29.339091869s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.62592497 +0000 UTC m=+1158.175845609" lastFinishedPulling="2026-03-18 12:29:24.602288372 +0000 UTC m=+1184.152209021" observedRunningTime="2026-03-18 12:29:25.333995044 +0000 UTC m=+1184.883915693" watchObservedRunningTime="2026-03-18 12:29:25.339091869 +0000 UTC m=+1184.889012518" Mar 18 12:29:26 crc kubenswrapper[4921]: I0318 12:29:26.334610 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" event={"ID":"e9b16b0b-6ae7-4f94-947d-d14ccce79710","Type":"ContainerStarted","Data":"f8d5abaa76239519f1befcf689bc4869fc61588ec66658284598afe2146c6ff4"} Mar 18 12:29:26 crc kubenswrapper[4921]: I0318 12:29:26.334862 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:29:26 crc kubenswrapper[4921]: I0318 12:29:26.361207 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" podStartSLOduration=3.227713848 podStartE2EDuration="30.361185449s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.620434015 +0000 UTC m=+1158.170354654" lastFinishedPulling="2026-03-18 12:29:25.753905616 +0000 UTC m=+1185.303826255" observedRunningTime="2026-03-18 12:29:26.35450593 +0000 UTC m=+1185.904426569" watchObservedRunningTime="2026-03-18 12:29:26.361185449 +0000 UTC m=+1185.911106088" Mar 18 12:29:26 crc kubenswrapper[4921]: I0318 12:29:26.576485 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:29:26 crc kubenswrapper[4921]: I0318 12:29:26.622641 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2f42g" Mar 18 12:29:26 crc kubenswrapper[4921]: I0318 12:29:26.792809 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cb29f" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.037324 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-png26" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.081583 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-2nnr8" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.088769 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-zst7p" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.115755 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-n67kw" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.188206 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-srlqt" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.233966 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-n8b8w" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.362039 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-ps7nq" Mar 18 12:29:27 crc kubenswrapper[4921]: I0318 12:29:27.519643 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-h8b2h" Mar 18 12:29:28 crc kubenswrapper[4921]: I0318 12:29:28.991367 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:28 crc kubenswrapper[4921]: I0318 12:29:28.998152 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09b26899-4ee2-482d-b190-b57c5d4cdfd3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-xzxsj\" (UID: \"09b26899-4ee2-482d-b190-b57c5d4cdfd3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:29 crc kubenswrapper[4921]: I0318 12:29:29.066888 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:29 crc kubenswrapper[4921]: I0318 12:29:29.613906 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj"] Mar 18 12:29:29 crc kubenswrapper[4921]: W0318 12:29:29.621343 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b26899_4ee2_482d_b190_b57c5d4cdfd3.slice/crio-83dcf444736885ba9fa2c412c848a89bf6e7d6c63659c492c4e41786101d4789 WatchSource:0}: Error finding container 83dcf444736885ba9fa2c412c848a89bf6e7d6c63659c492c4e41786101d4789: Status 404 returned error can't find the container with id 83dcf444736885ba9fa2c412c848a89bf6e7d6c63659c492c4e41786101d4789 Mar 18 12:29:30 crc kubenswrapper[4921]: I0318 12:29:30.361643 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" event={"ID":"09b26899-4ee2-482d-b190-b57c5d4cdfd3","Type":"ContainerStarted","Data":"83dcf444736885ba9fa2c412c848a89bf6e7d6c63659c492c4e41786101d4789"} Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:32.655814 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bx4rb" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:32.760272 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-wb587" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.402281 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" event={"ID":"09b26899-4ee2-482d-b190-b57c5d4cdfd3","Type":"ContainerStarted","Data":"d4d2f8f0558049d989cb2d98b33af9d4dc37bfd4696097144a49c18deef10c31"} Mar 18 
12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.403350 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.440392 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" podStartSLOduration=40.440373718000004 podStartE2EDuration="40.440373718s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:29:36.440231434 +0000 UTC m=+1195.990152083" watchObservedRunningTime="2026-03-18 12:29:36.440373718 +0000 UTC m=+1195.990294367" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.579833 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-rsjkx" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.604176 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-c4m65" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.643432 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-c7hxm" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.726744 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-cqxrg" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:36.860034 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xwbwb" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:37.070670 4921 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-vj44m" Mar 18 12:29:38 crc kubenswrapper[4921]: I0318 12:29:37.202637 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-bhx7s" Mar 18 12:29:39 crc kubenswrapper[4921]: I0318 12:29:39.424863 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" event={"ID":"aa7a1390-ce2f-4102-b998-d4dcf56abf25","Type":"ContainerStarted","Data":"0208d30915b74491ae79535dcbadc023bbaa0cd0863b4837a9cf005d63161a97"} Mar 18 12:29:39 crc kubenswrapper[4921]: I0318 12:29:39.426331 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:29:39 crc kubenswrapper[4921]: I0318 12:29:39.427733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" event={"ID":"0749ae95-942e-4331-bf11-707bb1cc131d","Type":"ContainerStarted","Data":"ae511039f8abf9096ad320c9af980bb1da16cf3aa162139b5347aaaf929aec0f"} Mar 18 12:29:39 crc kubenswrapper[4921]: I0318 12:29:39.428049 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:29:39 crc kubenswrapper[4921]: I0318 12:29:39.446407 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" podStartSLOduration=3.191663505 podStartE2EDuration="43.446384026s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.095653867 +0000 UTC m=+1157.645574506" lastFinishedPulling="2026-03-18 12:29:38.350374358 +0000 UTC m=+1197.900295027" observedRunningTime="2026-03-18 12:29:39.443841964 
+0000 UTC m=+1198.993762603" watchObservedRunningTime="2026-03-18 12:29:39.446384026 +0000 UTC m=+1198.996304685" Mar 18 12:29:39 crc kubenswrapper[4921]: I0318 12:29:39.466842 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" podStartSLOduration=3.224160905 podStartE2EDuration="43.466821454s" podCreationTimestamp="2026-03-18 12:28:56 +0000 UTC" firstStartedPulling="2026-03-18 12:28:58.108309595 +0000 UTC m=+1157.658230234" lastFinishedPulling="2026-03-18 12:29:38.350970154 +0000 UTC m=+1197.900890783" observedRunningTime="2026-03-18 12:29:39.460260269 +0000 UTC m=+1199.010180918" watchObservedRunningTime="2026-03-18 12:29:39.466821454 +0000 UTC m=+1199.016742093" Mar 18 12:29:46 crc kubenswrapper[4921]: I0318 12:29:46.701555 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-nsnz7" Mar 18 12:29:46 crc kubenswrapper[4921]: I0318 12:29:46.804287 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-wpc8r" Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.080997 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.081069 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 
12:29:47.081126 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.081919 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0b15d604734e663af7f5ab441b134e0458c09c7238f9cd112cf51b089408bef"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.081988 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://b0b15d604734e663af7f5ab441b134e0458c09c7238f9cd112cf51b089408bef" gracePeriod=600 Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.495939 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="b0b15d604734e663af7f5ab441b134e0458c09c7238f9cd112cf51b089408bef" exitCode=0 Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.496030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"b0b15d604734e663af7f5ab441b134e0458c09c7238f9cd112cf51b089408bef"} Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.496382 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"ecb2d426426fe45a8d3167724569351edcb20678eccb43a43192be5b68165da4"} Mar 18 12:29:47 crc kubenswrapper[4921]: I0318 12:29:47.496404 4921 scope.go:117] 
"RemoveContainer" containerID="ebcc7bf1aa6f60def18576e51eaa04202bf67a3ba2c684f5b12ee3391d160ae7" Mar 18 12:29:49 crc kubenswrapper[4921]: I0318 12:29:49.072956 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-xzxsj" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.143975 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563950-trmqv"] Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.145774 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.148684 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.149635 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.149823 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.154000 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-trmqv"] Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.159292 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx"] Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.160421 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.164120 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.164161 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.166490 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx"] Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.247207 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-secret-volume\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.247285 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-config-volume\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.247404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg68t\" (UniqueName: \"kubernetes.io/projected/0507122c-6da4-423e-ab92-47829824f5de-kube-api-access-jg68t\") pod \"auto-csr-approver-29563950-trmqv\" (UID: \"0507122c-6da4-423e-ab92-47829824f5de\") " 
pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.247738 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbzcj\" (UniqueName: \"kubernetes.io/projected/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-kube-api-access-zbzcj\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.349592 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-config-volume\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.349708 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg68t\" (UniqueName: \"kubernetes.io/projected/0507122c-6da4-423e-ab92-47829824f5de-kube-api-access-jg68t\") pod \"auto-csr-approver-29563950-trmqv\" (UID: \"0507122c-6da4-423e-ab92-47829824f5de\") " pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.349757 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbzcj\" (UniqueName: \"kubernetes.io/projected/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-kube-api-access-zbzcj\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.349799 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-secret-volume\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.350541 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-config-volume\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.356622 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-secret-volume\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.376179 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg68t\" (UniqueName: \"kubernetes.io/projected/0507122c-6da4-423e-ab92-47829824f5de-kube-api-access-jg68t\") pod \"auto-csr-approver-29563950-trmqv\" (UID: \"0507122c-6da4-423e-ab92-47829824f5de\") " pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.380025 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbzcj\" (UniqueName: \"kubernetes.io/projected/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-kube-api-access-zbzcj\") pod \"collect-profiles-29563950-9v6cx\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.479010 4921 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.480256 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:00 crc kubenswrapper[4921]: W0318 12:30:00.927375 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde6d5c48_5eeb_4cbb_bdba_c70202696ba6.slice/crio-b8f336d1b142ffc5bbd825ead672a145cbf3a97f54712971f3faef3432084b10 WatchSource:0}: Error finding container b8f336d1b142ffc5bbd825ead672a145cbf3a97f54712971f3faef3432084b10: Status 404 returned error can't find the container with id b8f336d1b142ffc5bbd825ead672a145cbf3a97f54712971f3faef3432084b10 Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.932432 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx"] Mar 18 12:30:00 crc kubenswrapper[4921]: I0318 12:30:00.991876 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-trmqv"] Mar 18 12:30:01 crc kubenswrapper[4921]: W0318 12:30:01.010577 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0507122c_6da4_423e_ab92_47829824f5de.slice/crio-02887cb4055fadd6f8b86bfcaf546dde58f6de645179c5e74cfdb5b1e02acc8f WatchSource:0}: Error finding container 02887cb4055fadd6f8b86bfcaf546dde58f6de645179c5e74cfdb5b1e02acc8f: Status 404 returned error can't find the container with id 02887cb4055fadd6f8b86bfcaf546dde58f6de645179c5e74cfdb5b1e02acc8f Mar 18 12:30:01 crc kubenswrapper[4921]: I0318 12:30:01.591301 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-trmqv" 
event={"ID":"0507122c-6da4-423e-ab92-47829824f5de","Type":"ContainerStarted","Data":"02887cb4055fadd6f8b86bfcaf546dde58f6de645179c5e74cfdb5b1e02acc8f"} Mar 18 12:30:01 crc kubenswrapper[4921]: I0318 12:30:01.593087 4921 generic.go:334] "Generic (PLEG): container finished" podID="de6d5c48-5eeb-4cbb-bdba-c70202696ba6" containerID="516df1388f4566cb03c09777d3a0acdb98f6a4a13e36916599314b90c1d83912" exitCode=0 Mar 18 12:30:01 crc kubenswrapper[4921]: I0318 12:30:01.593153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" event={"ID":"de6d5c48-5eeb-4cbb-bdba-c70202696ba6","Type":"ContainerDied","Data":"516df1388f4566cb03c09777d3a0acdb98f6a4a13e36916599314b90c1d83912"} Mar 18 12:30:01 crc kubenswrapper[4921]: I0318 12:30:01.593186 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" event={"ID":"de6d5c48-5eeb-4cbb-bdba-c70202696ba6","Type":"ContainerStarted","Data":"b8f336d1b142ffc5bbd825ead672a145cbf3a97f54712971f3faef3432084b10"} Mar 18 12:30:02 crc kubenswrapper[4921]: I0318 12:30:02.851469 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.510125 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-config-volume\") pod \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.510174 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-secret-volume\") pod \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.513515 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-config-volume" (OuterVolumeSpecName: "config-volume") pod "de6d5c48-5eeb-4cbb-bdba-c70202696ba6" (UID: "de6d5c48-5eeb-4cbb-bdba-c70202696ba6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.539332 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "de6d5c48-5eeb-4cbb-bdba-c70202696ba6" (UID: "de6d5c48-5eeb-4cbb-bdba-c70202696ba6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.608313 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" event={"ID":"de6d5c48-5eeb-4cbb-bdba-c70202696ba6","Type":"ContainerDied","Data":"b8f336d1b142ffc5bbd825ead672a145cbf3a97f54712971f3faef3432084b10"} Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.608553 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f336d1b142ffc5bbd825ead672a145cbf3a97f54712971f3faef3432084b10" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.608589 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.611440 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbzcj\" (UniqueName: \"kubernetes.io/projected/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-kube-api-access-zbzcj\") pod \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\" (UID: \"de6d5c48-5eeb-4cbb-bdba-c70202696ba6\") " Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.611995 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.612020 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.616236 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-kube-api-access-zbzcj" (OuterVolumeSpecName: 
"kube-api-access-zbzcj") pod "de6d5c48-5eeb-4cbb-bdba-c70202696ba6" (UID: "de6d5c48-5eeb-4cbb-bdba-c70202696ba6"). InnerVolumeSpecName "kube-api-access-zbzcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4921]: I0318 12:30:03.712604 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbzcj\" (UniqueName: \"kubernetes.io/projected/de6d5c48-5eeb-4cbb-bdba-c70202696ba6-kube-api-access-zbzcj\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.162585 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fwlks"] Mar 18 12:30:04 crc kubenswrapper[4921]: E0318 12:30:04.162888 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6d5c48-5eeb-4cbb-bdba-c70202696ba6" containerName="collect-profiles" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.162899 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6d5c48-5eeb-4cbb-bdba-c70202696ba6" containerName="collect-profiles" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.163172 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6d5c48-5eeb-4cbb-bdba-c70202696ba6" containerName="collect-profiles" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.163865 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.167290 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.167396 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.167313 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.167713 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-67hlw" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.184811 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fwlks"] Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.219025 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpp8k\" (UniqueName: \"kubernetes.io/projected/d1acabd5-815e-4bd6-949d-628c88f36edd-kube-api-access-tpp8k\") pod \"dnsmasq-dns-675f4bcbfc-fwlks\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.219356 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1acabd5-815e-4bd6-949d-628c88f36edd-config\") pod \"dnsmasq-dns-675f4bcbfc-fwlks\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.233058 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gglnq"] Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.234463 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.236450 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.252319 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gglnq"] Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.320609 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-config\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.320693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2cvt\" (UniqueName: \"kubernetes.io/projected/a3e572d1-e278-46cd-a29f-2592c8565a96-kube-api-access-c2cvt\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.320734 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpp8k\" (UniqueName: \"kubernetes.io/projected/d1acabd5-815e-4bd6-949d-628c88f36edd-kube-api-access-tpp8k\") pod \"dnsmasq-dns-675f4bcbfc-fwlks\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.320759 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1acabd5-815e-4bd6-949d-628c88f36edd-config\") pod \"dnsmasq-dns-675f4bcbfc-fwlks\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc 
kubenswrapper[4921]: I0318 12:30:04.320795 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.321888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1acabd5-815e-4bd6-949d-628c88f36edd-config\") pod \"dnsmasq-dns-675f4bcbfc-fwlks\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.338933 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpp8k\" (UniqueName: \"kubernetes.io/projected/d1acabd5-815e-4bd6-949d-628c88f36edd-kube-api-access-tpp8k\") pod \"dnsmasq-dns-675f4bcbfc-fwlks\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.422220 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.422812 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-config\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.422878 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c2cvt\" (UniqueName: \"kubernetes.io/projected/a3e572d1-e278-46cd-a29f-2592c8565a96-kube-api-access-c2cvt\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.423662 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.423743 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-config\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.444353 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2cvt\" (UniqueName: \"kubernetes.io/projected/a3e572d1-e278-46cd-a29f-2592c8565a96-kube-api-access-c2cvt\") pod \"dnsmasq-dns-78dd6ddcc-gglnq\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.482153 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.570285 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.631819 4921 generic.go:334] "Generic (PLEG): container finished" podID="0507122c-6da4-423e-ab92-47829824f5de" containerID="89930d04b4c7bed2254497ca2fd22c3ee2be9d23228d09531a0daf68b98760cb" exitCode=0 Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.632198 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-trmqv" event={"ID":"0507122c-6da4-423e-ab92-47829824f5de","Type":"ContainerDied","Data":"89930d04b4c7bed2254497ca2fd22c3ee2be9d23228d09531a0daf68b98760cb"} Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.854832 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gglnq"] Mar 18 12:30:04 crc kubenswrapper[4921]: W0318 12:30:04.855786 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3e572d1_e278_46cd_a29f_2592c8565a96.slice/crio-0982262d26380a8207dbc5e7447768facc373dd92319697da900f20e8a390870 WatchSource:0}: Error finding container 0982262d26380a8207dbc5e7447768facc373dd92319697da900f20e8a390870: Status 404 returned error can't find the container with id 0982262d26380a8207dbc5e7447768facc373dd92319697da900f20e8a390870 Mar 18 12:30:04 crc kubenswrapper[4921]: I0318 12:30:04.915200 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fwlks"] Mar 18 12:30:04 crc kubenswrapper[4921]: W0318 12:30:04.917379 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1acabd5_815e_4bd6_949d_628c88f36edd.slice/crio-c222e9dfa95317c539ecdad1ed23759a165d53a4b72c2a26aabd5575997818ca WatchSource:0}: Error finding container c222e9dfa95317c539ecdad1ed23759a165d53a4b72c2a26aabd5575997818ca: Status 404 returned error can't find the container with id 
c222e9dfa95317c539ecdad1ed23759a165d53a4b72c2a26aabd5575997818ca Mar 18 12:30:05 crc kubenswrapper[4921]: I0318 12:30:05.660490 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" event={"ID":"d1acabd5-815e-4bd6-949d-628c88f36edd","Type":"ContainerStarted","Data":"c222e9dfa95317c539ecdad1ed23759a165d53a4b72c2a26aabd5575997818ca"} Mar 18 12:30:05 crc kubenswrapper[4921]: I0318 12:30:05.675531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" event={"ID":"a3e572d1-e278-46cd-a29f-2592c8565a96","Type":"ContainerStarted","Data":"0982262d26380a8207dbc5e7447768facc373dd92319697da900f20e8a390870"} Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.027883 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.151032 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg68t\" (UniqueName: \"kubernetes.io/projected/0507122c-6da4-423e-ab92-47829824f5de-kube-api-access-jg68t\") pod \"0507122c-6da4-423e-ab92-47829824f5de\" (UID: \"0507122c-6da4-423e-ab92-47829824f5de\") " Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.161540 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0507122c-6da4-423e-ab92-47829824f5de-kube-api-access-jg68t" (OuterVolumeSpecName: "kube-api-access-jg68t") pod "0507122c-6da4-423e-ab92-47829824f5de" (UID: "0507122c-6da4-423e-ab92-47829824f5de"). InnerVolumeSpecName "kube-api-access-jg68t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.253178 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg68t\" (UniqueName: \"kubernetes.io/projected/0507122c-6da4-423e-ab92-47829824f5de-kube-api-access-jg68t\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.683842 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-trmqv" event={"ID":"0507122c-6da4-423e-ab92-47829824f5de","Type":"ContainerDied","Data":"02887cb4055fadd6f8b86bfcaf546dde58f6de645179c5e74cfdb5b1e02acc8f"} Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.683889 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02887cb4055fadd6f8b86bfcaf546dde58f6de645179c5e74cfdb5b1e02acc8f" Mar 18 12:30:06 crc kubenswrapper[4921]: I0318 12:30:06.683937 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-trmqv" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.090293 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-v2nk7"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.098247 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-v2nk7"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.223355 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc9b502-7736-4dc5-afcc-2d422bce4266" path="/var/lib/kubelet/pods/9dc9b502-7736-4dc5-afcc-2d422bce4266/volumes" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.291432 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fwlks"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.323644 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-fscjr"] Mar 18 12:30:07 crc kubenswrapper[4921]: E0318 12:30:07.324091 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0507122c-6da4-423e-ab92-47829824f5de" containerName="oc" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.324130 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0507122c-6da4-423e-ab92-47829824f5de" containerName="oc" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.324313 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0507122c-6da4-423e-ab92-47829824f5de" containerName="oc" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.325220 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.331084 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fscjr"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.477638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-config\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.477729 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.478028 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w79h\" (UniqueName: 
\"kubernetes.io/projected/c564bb1e-efbd-4ad2-aa41-4f53055cef70-kube-api-access-2w79h\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.585641 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w79h\" (UniqueName: \"kubernetes.io/projected/c564bb1e-efbd-4ad2-aa41-4f53055cef70-kube-api-access-2w79h\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.586164 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-config\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.586294 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.587279 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.588758 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-config\") pod 
\"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.591155 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gglnq"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.624953 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w79h\" (UniqueName: \"kubernetes.io/projected/c564bb1e-efbd-4ad2-aa41-4f53055cef70-kube-api-access-2w79h\") pod \"dnsmasq-dns-5ccc8479f9-fscjr\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.628889 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvhbz"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.632586 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.639357 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvhbz"] Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.661539 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.690713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-config\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.690850 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.690911 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8wb6\" (UniqueName: \"kubernetes.io/projected/c73a721b-d0ec-48bb-8106-c35c9d1d3466-kube-api-access-p8wb6\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.793544 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.793627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8wb6\" (UniqueName: \"kubernetes.io/projected/c73a721b-d0ec-48bb-8106-c35c9d1d3466-kube-api-access-p8wb6\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: 
\"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.793678 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-config\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.795402 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-config\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.795822 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.818047 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8wb6\" (UniqueName: \"kubernetes.io/projected/c73a721b-d0ec-48bb-8106-c35c9d1d3466-kube-api-access-p8wb6\") pod \"dnsmasq-dns-57d769cc4f-bvhbz\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.983491 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:07 crc kubenswrapper[4921]: I0318 12:30:07.999583 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fscjr"] Mar 18 12:30:08 crc kubenswrapper[4921]: W0318 12:30:08.018302 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc564bb1e_efbd_4ad2_aa41_4f53055cef70.slice/crio-f25cb37d335df5c03ff7a4d87b8b7ab6913e79178f1926ca6d7fb7004680d309 WatchSource:0}: Error finding container f25cb37d335df5c03ff7a4d87b8b7ab6913e79178f1926ca6d7fb7004680d309: Status 404 returned error can't find the container with id f25cb37d335df5c03ff7a4d87b8b7ab6913e79178f1926ca6d7fb7004680d309 Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.450376 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.452888 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.455277 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.455649 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.455788 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.456218 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x87df" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.456559 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.456808 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.457009 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.476959 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.590152 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvhbz"] Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612547 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc 
kubenswrapper[4921]: I0318 12:30:08.612584 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612604 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612633 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612650 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612669 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc 
kubenswrapper[4921]: I0318 12:30:08.612708 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef935990-b291-43b7-9d56-673b7b05a7a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612727 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmdl\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-kube-api-access-vkmdl\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612757 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef935990-b291-43b7-9d56-673b7b05a7a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.612793 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 
crc kubenswrapper[4921]: I0318 12:30:08.706745 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" event={"ID":"c73a721b-d0ec-48bb-8106-c35c9d1d3466","Type":"ContainerStarted","Data":"a4b3a0fb211fe7ef0aa26331234d33f0ecb586b419593c56430f9bec4868febc"} Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.711815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" event={"ID":"c564bb1e-efbd-4ad2-aa41-4f53055cef70","Type":"ContainerStarted","Data":"f25cb37d335df5c03ff7a4d87b8b7ab6913e79178f1926ca6d7fb7004680d309"} Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.713817 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.713872 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.713902 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.713929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.714335 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef935990-b291-43b7-9d56-673b7b05a7a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.714405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmdl\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-kube-api-access-vkmdl\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.714462 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef935990-b291-43b7-9d56-673b7b05a7a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.714498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.715963 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.716955 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.717365 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.717448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.717480 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.718041 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.718006 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.718312 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.718923 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.723900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.730591 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef935990-b291-43b7-9d56-673b7b05a7a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.737011 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 
12:30:08.738042 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.738834 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.739499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmdl\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-kube-api-access-vkmdl\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.747552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef935990-b291-43b7-9d56-673b7b05a7a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.753827 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.754016 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.754398 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2gqkc" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.754575 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.754756 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.754602 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.755985 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.756005 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.778807 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.801385 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.922635 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923034 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923072 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923099 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923149 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " 
pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df692663-cc58-4cf1-a05b-566e0152ee90-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923204 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923240 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df692663-cc58-4cf1-a05b-566e0152ee90-pod-info\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923267 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s5fz\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-kube-api-access-7s5fz\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc kubenswrapper[4921]: I0318 12:30:08.923282 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-server-conf\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:08 crc 
kubenswrapper[4921]: I0318 12:30:08.923305 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.025869 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.025952 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.025991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026024 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df692663-cc58-4cf1-a05b-566e0152ee90-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026065 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026097 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df692663-cc58-4cf1-a05b-566e0152ee90-pod-info\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026148 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s5fz\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-kube-api-access-7s5fz\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026170 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-server-conf\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026206 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026233 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " 
pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.026262 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.027709 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.027818 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.028088 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.028131 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.028307 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.029600 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-server-conf\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.034243 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df692663-cc58-4cf1-a05b-566e0152ee90-pod-info\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.034340 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.034784 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.036789 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df692663-cc58-4cf1-a05b-566e0152ee90-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " 
pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.049731 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s5fz\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-kube-api-access-7s5fz\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.059336 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.150770 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.165237 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.639012 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.640482 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.647192 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.647313 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.647439 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.647636 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-b5xqq" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.653618 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.658210 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741239 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741314 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741363 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741429 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-kolla-config\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkscf\" (UniqueName: \"kubernetes.io/projected/bb6e4980-bd4a-455e-924b-739cee9587c9-kube-api-access-xkscf\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741495 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-default\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741517 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.741538 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.843873 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-kolla-config\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.843984 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkscf\" (UniqueName: \"kubernetes.io/projected/bb6e4980-bd4a-455e-924b-739cee9587c9-kube-api-access-xkscf\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.844076 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-default\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.844161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.844230 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.844312 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.844379 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.844478 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.846060 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.846661 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0" Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.846915 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-default\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.848123 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.848597 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-kolla-config\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.858657 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.872156 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.875779 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkscf\" (UniqueName: \"kubernetes.io/projected/bb6e4980-bd4a-455e-924b-739cee9587c9-kube-api-access-xkscf\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.911594 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " pod="openstack/openstack-galera-0"
Mar 18 12:30:09 crc kubenswrapper[4921]: I0318 12:30:09.993010 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.007507 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.008908 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.011765 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.011938 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.012104 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xqzpg"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.012973 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.017201 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.058167 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.059071 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.062205 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dvd9l"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.062366 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.062526 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.067198 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.077947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078073 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lgr\" (UniqueName: \"kubernetes.io/projected/f3456852-8fb7-4e40-81d1-f3ba06088f81-kube-api-access-c5lgr\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078211 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078254 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078288 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078324 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.078356 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179127 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179178 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lgr\" (UniqueName: \"kubernetes.io/projected/f3456852-8fb7-4e40-81d1-f3ba06088f81-kube-api-access-c5lgr\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179247 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kolla-config\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179281 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179333 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179357 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179383 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179385 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klm8k\" (UniqueName: \"kubernetes.io/projected/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kube-api-access-klm8k\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.179948 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.180039 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-config-data\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.180097 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.180425 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.180472 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.180635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.182590 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.185440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.195075 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.196476 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lgr\" (UniqueName: \"kubernetes.io/projected/f3456852-8fb7-4e40-81d1-f3ba06088f81-kube-api-access-c5lgr\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.200558 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.282614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.282718 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kolla-config\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.282790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.282938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klm8k\" (UniqueName: \"kubernetes.io/projected/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kube-api-access-klm8k\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.283024 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-config-data\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.285250 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-config-data\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.286069 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kolla-config\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.288635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.290079 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.303239 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klm8k\" (UniqueName: \"kubernetes.io/projected/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kube-api-access-klm8k\") pod \"memcached-0\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " pod="openstack/memcached-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.367078 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 12:30:11 crc kubenswrapper[4921]: I0318 12:30:11.402083 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.440422 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.441704 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.449719 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8pgd9"
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.457967 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.517667 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlb6\" (UniqueName: \"kubernetes.io/projected/45759e25-d3df-4741-bbc3-4111118d3d1e-kube-api-access-7nlb6\") pod \"kube-state-metrics-0\" (UID: \"45759e25-d3df-4741-bbc3-4111118d3d1e\") " pod="openstack/kube-state-metrics-0"
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.619586 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlb6\" (UniqueName: \"kubernetes.io/projected/45759e25-d3df-4741-bbc3-4111118d3d1e-kube-api-access-7nlb6\") pod \"kube-state-metrics-0\" (UID: \"45759e25-d3df-4741-bbc3-4111118d3d1e\") " pod="openstack/kube-state-metrics-0"
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.644095 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlb6\" (UniqueName: \"kubernetes.io/projected/45759e25-d3df-4741-bbc3-4111118d3d1e-kube-api-access-7nlb6\") pod \"kube-state-metrics-0\" (UID: \"45759e25-d3df-4741-bbc3-4111118d3d1e\") " pod="openstack/kube-state-metrics-0"
Mar 18 12:30:13 crc kubenswrapper[4921]: I0318 12:30:13.769576 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 12:30:15 crc kubenswrapper[4921]: I0318 12:30:15.783308 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef935990-b291-43b7-9d56-673b7b05a7a7","Type":"ContainerStarted","Data":"4622bc4fb13d840177305177d8e970d9bb86f4736ee8dba42a0d3595e9fb72be"}
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.428674 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djh4f"]
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.429945 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.432311 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-frlzn"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.432912 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.435553 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bg8nq"]
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.436454 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.437218 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.447248 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bg8nq"]
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.454171 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djh4f"]
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.570924 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76380191-f4a9-4690-bb6e-cb85ad794e33-scripts\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.570977 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v92nk\" (UniqueName: \"kubernetes.io/projected/76380191-f4a9-4690-bb6e-cb85ad794e33-kube-api-access-v92nk\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571005 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-combined-ca-bundle\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-ovn-controller-tls-certs\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571134 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-run\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571173 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-log\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run-ovn\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571220 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-lib\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571249 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571357 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-etc-ovs\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571412 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kmj\" (UniqueName: \"kubernetes.io/projected/3ee31803-52cb-4fcd-8ab1-990b0440a67a-kube-api-access-n2kmj\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571477 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-log-ovn\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.571558 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ee31803-52cb-4fcd-8ab1-990b0440a67a-scripts\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672377 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-run\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672434 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-log\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run-ovn\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672482 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-lib\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672508 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672578 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-etc-ovs\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672606 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2kmj\" (UniqueName: \"kubernetes.io/projected/3ee31803-52cb-4fcd-8ab1-990b0440a67a-kube-api-access-n2kmj\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672631 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-log-ovn\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672653 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ee31803-52cb-4fcd-8ab1-990b0440a67a-scripts\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672678 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76380191-f4a9-4690-bb6e-cb85ad794e33-scripts\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672694 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v92nk\" (UniqueName: \"kubernetes.io/projected/76380191-f4a9-4690-bb6e-cb85ad794e33-kube-api-access-v92nk\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672707 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-combined-ca-bundle\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.672741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-ovn-controller-tls-certs\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.679746 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-lib\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.679916 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-etc-ovs\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.680409 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run-ovn\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.688532 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-run\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.688943 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-ovn-controller-tls-certs\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.688986 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-log-ovn\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.689462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-log\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.689554 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.692033 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76380191-f4a9-4690-bb6e-cb85ad794e33-scripts\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.702992 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ee31803-52cb-4fcd-8ab1-990b0440a67a-scripts\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.703954 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-combined-ca-bundle\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.705518 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kmj\" (UniqueName: \"kubernetes.io/projected/3ee31803-52cb-4fcd-8ab1-990b0440a67a-kube-api-access-n2kmj\") pod \"ovn-controller-djh4f\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.705813 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v92nk\" (UniqueName: \"kubernetes.io/projected/76380191-f4a9-4690-bb6e-cb85ad794e33-kube-api-access-v92nk\") pod \"ovn-controller-ovs-bg8nq\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.750546 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f"
Mar 18 12:30:16 crc kubenswrapper[4921]: I0318 12:30:16.772685 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bg8nq"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.176161 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.180380 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.182885 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-4k574"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.182998 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.183008 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.183020 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.183069 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.201513 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282318 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282391 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-config\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282425 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282442 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f884h\" (UniqueName: \"kubernetes.io/projected/abf100e1-595d-4dce-9125-c27db7e9408a-kube-api-access-f884h\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.282661 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386469 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f884h\" (UniqueName: \"kubernetes.io/projected/abf100e1-595d-4dce-9125-c27db7e9408a-kube-api-access-f884h\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386597 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386705 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386763 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") 
" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386810 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-config\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386836 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.386858 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.387353 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.389018 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.392888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.393535 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-config\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.393869 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.405970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.406081 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.409129 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f884h\" (UniqueName: \"kubernetes.io/projected/abf100e1-595d-4dce-9125-c27db7e9408a-kube-api-access-f884h\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 
18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.413082 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:17 crc kubenswrapper[4921]: I0318 12:30:17.504318 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.613936 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.615902 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.617848 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.617917 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.621875 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-w2qvb" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.627859 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.633478 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.754885 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.755043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldnl6\" (UniqueName: \"kubernetes.io/projected/e1887f67-2204-45c0-be34-9471594f217c-kube-api-access-ldnl6\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.755154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-config\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.755252 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.755435 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.755525 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc 
kubenswrapper[4921]: I0318 12:30:20.755552 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1887f67-2204-45c0-be34-9471594f217c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.755576 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.856853 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857148 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857164 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1887f67-2204-45c0-be34-9471594f217c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857185 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857210 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857254 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldnl6\" (UniqueName: \"kubernetes.io/projected/e1887f67-2204-45c0-be34-9471594f217c-kube-api-access-ldnl6\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-config\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857551 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") device mount path 
\"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.857659 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1887f67-2204-45c0-be34-9471594f217c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.858715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-config\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.863271 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.877015 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.877721 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldnl6\" (UniqueName: \"kubernetes.io/projected/e1887f67-2204-45c0-be34-9471594f217c-kube-api-access-ldnl6\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.878650 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.888832 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.903755 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:20 crc kubenswrapper[4921]: I0318 12:30:20.950451 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.686679 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.687393 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2w79h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:
&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-fscjr_openstack(c564bb1e-efbd-4ad2-aa41-4f53055cef70): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.688619 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" podUID="c564bb1e-efbd-4ad2-aa41-4f53055cef70" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.756173 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.756334 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpp8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fwlks_openstack(d1acabd5-815e-4bd6-949d-628c88f36edd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.757429 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" podUID="d1acabd5-815e-4bd6-949d-628c88f36edd" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.760146 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.760253 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c2cvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-gglnq_openstack(a3e572d1-e278-46cd-a29f-2592c8565a96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.761404 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" podUID="a3e572d1-e278-46cd-a29f-2592c8565a96" Mar 18 12:30:23 crc kubenswrapper[4921]: E0318 12:30:23.849639 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" podUID="c564bb1e-efbd-4ad2-aa41-4f53055cef70" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.025321 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.032011 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.133262 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1acabd5-815e-4bd6-949d-628c88f36edd-config\") pod \"d1acabd5-815e-4bd6-949d-628c88f36edd\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.133385 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-dns-svc\") pod \"a3e572d1-e278-46cd-a29f-2592c8565a96\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.133425 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpp8k\" (UniqueName: \"kubernetes.io/projected/d1acabd5-815e-4bd6-949d-628c88f36edd-kube-api-access-tpp8k\") pod \"d1acabd5-815e-4bd6-949d-628c88f36edd\" (UID: \"d1acabd5-815e-4bd6-949d-628c88f36edd\") " Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.133468 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2cvt\" (UniqueName: \"kubernetes.io/projected/a3e572d1-e278-46cd-a29f-2592c8565a96-kube-api-access-c2cvt\") pod \"a3e572d1-e278-46cd-a29f-2592c8565a96\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.133512 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-config\") pod \"a3e572d1-e278-46cd-a29f-2592c8565a96\" (UID: \"a3e572d1-e278-46cd-a29f-2592c8565a96\") " Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.134449 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-config" (OuterVolumeSpecName: "config") pod "a3e572d1-e278-46cd-a29f-2592c8565a96" (UID: "a3e572d1-e278-46cd-a29f-2592c8565a96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.134680 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1acabd5-815e-4bd6-949d-628c88f36edd-config" (OuterVolumeSpecName: "config") pod "d1acabd5-815e-4bd6-949d-628c88f36edd" (UID: "d1acabd5-815e-4bd6-949d-628c88f36edd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.135191 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3e572d1-e278-46cd-a29f-2592c8565a96" (UID: "a3e572d1-e278-46cd-a29f-2592c8565a96"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.160346 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1acabd5-815e-4bd6-949d-628c88f36edd-kube-api-access-tpp8k" (OuterVolumeSpecName: "kube-api-access-tpp8k") pod "d1acabd5-815e-4bd6-949d-628c88f36edd" (UID: "d1acabd5-815e-4bd6-949d-628c88f36edd"). InnerVolumeSpecName "kube-api-access-tpp8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.161596 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e572d1-e278-46cd-a29f-2592c8565a96-kube-api-access-c2cvt" (OuterVolumeSpecName: "kube-api-access-c2cvt") pod "a3e572d1-e278-46cd-a29f-2592c8565a96" (UID: "a3e572d1-e278-46cd-a29f-2592c8565a96"). InnerVolumeSpecName "kube-api-access-c2cvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.235089 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.235166 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpp8k\" (UniqueName: \"kubernetes.io/projected/d1acabd5-815e-4bd6-949d-628c88f36edd-kube-api-access-tpp8k\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.235182 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2cvt\" (UniqueName: \"kubernetes.io/projected/a3e572d1-e278-46cd-a29f-2592c8565a96-kube-api-access-c2cvt\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.235194 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3e572d1-e278-46cd-a29f-2592c8565a96-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.235206 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1acabd5-815e-4bd6-949d-628c88f36edd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.481274 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: W0318 12:30:25.483800 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3456852_8fb7_4e40_81d1_f3ba06088f81.slice/crio-6f79d6ec4502a0d208ebe7a83131b01164475c00630991f1a1ce404632962d1f WatchSource:0}: Error finding container 6f79d6ec4502a0d208ebe7a83131b01164475c00630991f1a1ce404632962d1f: Status 404 returned error can't find the 
container with id 6f79d6ec4502a0d208ebe7a83131b01164475c00630991f1a1ce404632962d1f Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.605103 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: W0318 12:30:25.606739 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf2f76d2_7d0e_450c_8218_0cf40e03cbee.slice/crio-75c695c0563603d0230e0dd6504fe9dc600caf3168e14edce241b97f2128826d WatchSource:0}: Error finding container 75c695c0563603d0230e0dd6504fe9dc600caf3168e14edce241b97f2128826d: Status 404 returned error can't find the container with id 75c695c0563603d0230e0dd6504fe9dc600caf3168e14edce241b97f2128826d Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.684136 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: W0318 12:30:25.684704 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6e4980_bd4a_455e_924b_739cee9587c9.slice/crio-e92da77e6fa4610ea197df65bd241bebface9949245ec4d188f3e927c00df125 WatchSource:0}: Error finding container e92da77e6fa4610ea197df65bd241bebface9949245ec4d188f3e927c00df125: Status 404 returned error can't find the container with id e92da77e6fa4610ea197df65bd241bebface9949245ec4d188f3e927c00df125 Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.692686 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: W0318 12:30:25.699104 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf692663_cc58_4cf1_a05b_566e0152ee90.slice/crio-d8ad9642d887a2d3381872d62b39b35378edf7a96826efe3b9e9327cfa22a4dc WatchSource:0}: Error finding container 
d8ad9642d887a2d3381872d62b39b35378edf7a96826efe3b9e9327cfa22a4dc: Status 404 returned error can't find the container with id d8ad9642d887a2d3381872d62b39b35378edf7a96826efe3b9e9327cfa22a4dc Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.804034 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.840407 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.853422 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djh4f"] Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.884384 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cf2f76d2-7d0e-450c-8218-0cf40e03cbee","Type":"ContainerStarted","Data":"75c695c0563603d0230e0dd6504fe9dc600caf3168e14edce241b97f2128826d"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.888153 4921 generic.go:334] "Generic (PLEG): container finished" podID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerID="3508bb686babe7a6f0e49ba27aef0d00dcfce1ee6ff5588ed84d3b681ae3dcce" exitCode=0 Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.888234 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" event={"ID":"c73a721b-d0ec-48bb-8106-c35c9d1d3466","Type":"ContainerDied","Data":"3508bb686babe7a6f0e49ba27aef0d00dcfce1ee6ff5588ed84d3b681ae3dcce"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.892599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df692663-cc58-4cf1-a05b-566e0152ee90","Type":"ContainerStarted","Data":"d8ad9642d887a2d3381872d62b39b35378edf7a96826efe3b9e9327cfa22a4dc"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.897424 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.897491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fwlks" event={"ID":"d1acabd5-815e-4bd6-949d-628c88f36edd","Type":"ContainerDied","Data":"c222e9dfa95317c539ecdad1ed23759a165d53a4b72c2a26aabd5575997818ca"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.900029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb6e4980-bd4a-455e-924b-739cee9587c9","Type":"ContainerStarted","Data":"e92da77e6fa4610ea197df65bd241bebface9949245ec4d188f3e927c00df125"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.901847 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f3456852-8fb7-4e40-81d1-f3ba06088f81","Type":"ContainerStarted","Data":"6f79d6ec4502a0d208ebe7a83131b01164475c00630991f1a1ce404632962d1f"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.904746 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" event={"ID":"a3e572d1-e278-46cd-a29f-2592c8565a96","Type":"ContainerDied","Data":"0982262d26380a8207dbc5e7447768facc373dd92319697da900f20e8a390870"} Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.904850 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gglnq" Mar 18 12:30:25 crc kubenswrapper[4921]: I0318 12:30:25.953616 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:30:25 crc kubenswrapper[4921]: W0318 12:30:25.955615 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1887f67_2204_45c0_be34_9471594f217c.slice/crio-0b397ad591533ebe1ec56706eddfe15e2df40f430c1302166d0264f1e64f0548 WatchSource:0}: Error finding container 0b397ad591533ebe1ec56706eddfe15e2df40f430c1302166d0264f1e64f0548: Status 404 returned error can't find the container with id 0b397ad591533ebe1ec56706eddfe15e2df40f430c1302166d0264f1e64f0548 Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.146727 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gglnq"] Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.153505 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gglnq"] Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.177009 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fwlks"] Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.183762 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fwlks"] Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.785103 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bg8nq"] Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.916213 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45759e25-d3df-4741-bbc3-4111118d3d1e","Type":"ContainerStarted","Data":"93a405b0ea0f31e1007feb75e7bc5b4c579bb3ab82c2fcf35a5d34a003b6ba79"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.917742 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef935990-b291-43b7-9d56-673b7b05a7a7","Type":"ContainerStarted","Data":"a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.920101 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf100e1-595d-4dce-9125-c27db7e9408a","Type":"ContainerStarted","Data":"fb29aa2ecd303f3ad9fa4bb4e91b68eff97431f3aadc4531a48c4ee4624b1a8d"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.923034 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" event={"ID":"c73a721b-d0ec-48bb-8106-c35c9d1d3466","Type":"ContainerStarted","Data":"fa2c32b2c306a96413d59bd0681650cb0a0da204cbf58820194a33330fabaa55"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.923127 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.924106 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f" event={"ID":"3ee31803-52cb-4fcd-8ab1-990b0440a67a","Type":"ContainerStarted","Data":"9bf619b0c98e4a96d465d46d66874a7f13971b32f242ce3c41cbaf1d94033161"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.926827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df692663-cc58-4cf1-a05b-566e0152ee90","Type":"ContainerStarted","Data":"dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.928687 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1887f67-2204-45c0-be34-9471594f217c","Type":"ContainerStarted","Data":"0b397ad591533ebe1ec56706eddfe15e2df40f430c1302166d0264f1e64f0548"} Mar 18 12:30:26 crc kubenswrapper[4921]: I0318 12:30:26.965491 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" podStartSLOduration=3.677094054 podStartE2EDuration="19.965470462s" podCreationTimestamp="2026-03-18 12:30:07 +0000 UTC" firstStartedPulling="2026-03-18 12:30:08.60871586 +0000 UTC m=+1228.158636499" lastFinishedPulling="2026-03-18 12:30:24.897092258 +0000 UTC m=+1244.447012907" observedRunningTime="2026-03-18 12:30:26.964630699 +0000 UTC m=+1246.514551338" watchObservedRunningTime="2026-03-18 12:30:26.965470462 +0000 UTC m=+1246.515391101" Mar 18 12:30:27 crc kubenswrapper[4921]: W0318 12:30:27.044027 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76380191_f4a9_4690_bb6e_cb85ad794e33.slice/crio-b2567843608c14304b7309cb15eb4c236be3e49a9f1421e9d1c36309289091e1 WatchSource:0}: Error finding container b2567843608c14304b7309cb15eb4c236be3e49a9f1421e9d1c36309289091e1: Status 404 returned error can't find the container with id b2567843608c14304b7309cb15eb4c236be3e49a9f1421e9d1c36309289091e1 Mar 18 12:30:27 crc kubenswrapper[4921]: I0318 12:30:27.224540 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e572d1-e278-46cd-a29f-2592c8565a96" path="/var/lib/kubelet/pods/a3e572d1-e278-46cd-a29f-2592c8565a96/volumes" Mar 18 12:30:27 crc kubenswrapper[4921]: I0318 12:30:27.225373 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1acabd5-815e-4bd6-949d-628c88f36edd" path="/var/lib/kubelet/pods/d1acabd5-815e-4bd6-949d-628c88f36edd/volumes" Mar 18 12:30:27 crc kubenswrapper[4921]: I0318 12:30:27.938679 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerStarted","Data":"b2567843608c14304b7309cb15eb4c236be3e49a9f1421e9d1c36309289091e1"} Mar 18 12:30:32 crc kubenswrapper[4921]: I0318 12:30:32.986078 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.046633 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fscjr"] Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.684925 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.789646 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w79h\" (UniqueName: \"kubernetes.io/projected/c564bb1e-efbd-4ad2-aa41-4f53055cef70-kube-api-access-2w79h\") pod \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.790094 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-dns-svc\") pod \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.790211 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-config\") pod \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\" (UID: \"c564bb1e-efbd-4ad2-aa41-4f53055cef70\") " Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.790575 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c564bb1e-efbd-4ad2-aa41-4f53055cef70" (UID: "c564bb1e-efbd-4ad2-aa41-4f53055cef70"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.790855 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-config" (OuterVolumeSpecName: "config") pod "c564bb1e-efbd-4ad2-aa41-4f53055cef70" (UID: "c564bb1e-efbd-4ad2-aa41-4f53055cef70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.797950 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c564bb1e-efbd-4ad2-aa41-4f53055cef70-kube-api-access-2w79h" (OuterVolumeSpecName: "kube-api-access-2w79h") pod "c564bb1e-efbd-4ad2-aa41-4f53055cef70" (UID: "c564bb1e-efbd-4ad2-aa41-4f53055cef70"). InnerVolumeSpecName "kube-api-access-2w79h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.892103 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.892208 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c564bb1e-efbd-4ad2-aa41-4f53055cef70-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:33 crc kubenswrapper[4921]: I0318 12:30:33.892221 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w79h\" (UniqueName: \"kubernetes.io/projected/c564bb1e-efbd-4ad2-aa41-4f53055cef70-kube-api-access-2w79h\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:33.999363 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"cf2f76d2-7d0e-450c-8218-0cf40e03cbee","Type":"ContainerStarted","Data":"d1d72c66ce5a1eb5ac5faf2d0a921bafbdff4044fe05e8e42218253919f68d86"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:33.999928 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.001154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf100e1-595d-4dce-9125-c27db7e9408a","Type":"ContainerStarted","Data":"e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.005181 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerStarted","Data":"12a442cf99669b3576078f8841bcba37f822c4c9adab4bd21d43b92d7f71b599"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.007282 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f" event={"ID":"3ee31803-52cb-4fcd-8ab1-990b0440a67a","Type":"ContainerStarted","Data":"91ea66d0192bbffc94a048cad7e99c8827ba926ea7b1b2762795c9d6c5b354a9"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.007459 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-djh4f" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.008999 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb6e4980-bd4a-455e-924b-739cee9587c9","Type":"ContainerStarted","Data":"ee45e93de09bd0d89582c74ef72aeca3a1f8a1f9676053db278804ba3c1bdc92"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.012020 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"45759e25-d3df-4741-bbc3-4111118d3d1e","Type":"ContainerStarted","Data":"0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.012058 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.013682 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f3456852-8fb7-4e40-81d1-f3ba06088f81","Type":"ContainerStarted","Data":"e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.015933 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.015928 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-fscjr" event={"ID":"c564bb1e-efbd-4ad2-aa41-4f53055cef70","Type":"ContainerDied","Data":"f25cb37d335df5c03ff7a4d87b8b7ab6913e79178f1926ca6d7fb7004680d309"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.017586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1887f67-2204-45c0-be34-9471594f217c","Type":"ContainerStarted","Data":"4eb6151343cc2ec88716d0980c3d56c2d49d23288a278634cdfaa260afdf18bd"} Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.027723 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.256568856 podStartE2EDuration="23.027686148s" podCreationTimestamp="2026-03-18 12:30:11 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.610316187 +0000 UTC m=+1245.160236826" lastFinishedPulling="2026-03-18 12:30:32.381433479 +0000 UTC m=+1251.931354118" observedRunningTime="2026-03-18 12:30:34.02248268 +0000 UTC m=+1253.572403319" watchObservedRunningTime="2026-03-18 
12:30:34.027686148 +0000 UTC m=+1253.577606787" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.195969 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-djh4f" podStartSLOduration=11.361387765 podStartE2EDuration="18.19595519s" podCreationTimestamp="2026-03-18 12:30:16 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.886808786 +0000 UTC m=+1245.436729425" lastFinishedPulling="2026-03-18 12:30:32.721376211 +0000 UTC m=+1252.271296850" observedRunningTime="2026-03-18 12:30:34.194047786 +0000 UTC m=+1253.743968425" watchObservedRunningTime="2026-03-18 12:30:34.19595519 +0000 UTC m=+1253.745875829" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.226378 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.78874386 podStartE2EDuration="21.226362294s" podCreationTimestamp="2026-03-18 12:30:13 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.883049659 +0000 UTC m=+1245.432970298" lastFinishedPulling="2026-03-18 12:30:33.320668083 +0000 UTC m=+1252.870588732" observedRunningTime="2026-03-18 12:30:34.224503611 +0000 UTC m=+1253.774424250" watchObservedRunningTime="2026-03-18 12:30:34.226362294 +0000 UTC m=+1253.776282933" Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.265793 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fscjr"] Mar 18 12:30:34 crc kubenswrapper[4921]: I0318 12:30:34.273490 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fscjr"] Mar 18 12:30:35 crc kubenswrapper[4921]: I0318 12:30:35.030397 4921 generic.go:334] "Generic (PLEG): container finished" podID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerID="12a442cf99669b3576078f8841bcba37f822c4c9adab4bd21d43b92d7f71b599" exitCode=0 Mar 18 12:30:35 crc kubenswrapper[4921]: I0318 12:30:35.030455 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerDied","Data":"12a442cf99669b3576078f8841bcba37f822c4c9adab4bd21d43b92d7f71b599"} Mar 18 12:30:35 crc kubenswrapper[4921]: I0318 12:30:35.219329 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c564bb1e-efbd-4ad2-aa41-4f53055cef70" path="/var/lib/kubelet/pods/c564bb1e-efbd-4ad2-aa41-4f53055cef70/volumes" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.061787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf100e1-595d-4dce-9125-c27db7e9408a","Type":"ContainerStarted","Data":"208313c91671ca6bab23b1fd323268c213a12034b1cf7b7e4deee0560bc9b81b"} Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.064277 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerStarted","Data":"8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1"} Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.064354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerStarted","Data":"78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b"} Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.064411 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bg8nq" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.064492 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bg8nq" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.068231 4921 generic.go:334] "Generic (PLEG): container finished" podID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerID="ee45e93de09bd0d89582c74ef72aeca3a1f8a1f9676053db278804ba3c1bdc92" exitCode=0 Mar 18 12:30:38 crc 
kubenswrapper[4921]: I0318 12:30:38.068303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb6e4980-bd4a-455e-924b-739cee9587c9","Type":"ContainerDied","Data":"ee45e93de09bd0d89582c74ef72aeca3a1f8a1f9676053db278804ba3c1bdc92"} Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.070676 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1887f67-2204-45c0-be34-9471594f217c","Type":"ContainerStarted","Data":"84aeae41e48cf9f3868fdc45a2a0e25ada202b804f77350b8ae8019f82805fc0"} Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.071900 4921 generic.go:334] "Generic (PLEG): container finished" podID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerID="e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d" exitCode=0 Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.071930 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f3456852-8fb7-4e40-81d1-f3ba06088f81","Type":"ContainerDied","Data":"e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d"} Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.086952 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.063370305 podStartE2EDuration="22.086933926s" podCreationTimestamp="2026-03-18 12:30:16 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.883019268 +0000 UTC m=+1245.432939907" lastFinishedPulling="2026-03-18 12:30:36.906582889 +0000 UTC m=+1256.456503528" observedRunningTime="2026-03-18 12:30:38.08076496 +0000 UTC m=+1257.630685599" watchObservedRunningTime="2026-03-18 12:30:38.086933926 +0000 UTC m=+1257.636854555" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.110229 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bg8nq" podStartSLOduration=16.494445412 
podStartE2EDuration="22.110207457s" podCreationTimestamp="2026-03-18 12:30:16 +0000 UTC" firstStartedPulling="2026-03-18 12:30:27.046904937 +0000 UTC m=+1246.596825576" lastFinishedPulling="2026-03-18 12:30:32.662666982 +0000 UTC m=+1252.212587621" observedRunningTime="2026-03-18 12:30:38.102491808 +0000 UTC m=+1257.652412447" watchObservedRunningTime="2026-03-18 12:30:38.110207457 +0000 UTC m=+1257.660128116" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.127479 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.197992852 podStartE2EDuration="19.127462708s" podCreationTimestamp="2026-03-18 12:30:19 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.958099242 +0000 UTC m=+1245.508019881" lastFinishedPulling="2026-03-18 12:30:36.887569108 +0000 UTC m=+1256.437489737" observedRunningTime="2026-03-18 12:30:38.123020291 +0000 UTC m=+1257.672940970" watchObservedRunningTime="2026-03-18 12:30:38.127462708 +0000 UTC m=+1257.677383347" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.505231 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.545528 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.951186 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:38 crc kubenswrapper[4921]: I0318 12:30:38.986574 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.096977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"bb6e4980-bd4a-455e-924b-739cee9587c9","Type":"ContainerStarted","Data":"df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff"} Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.100181 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f3456852-8fb7-4e40-81d1-f3ba06088f81","Type":"ContainerStarted","Data":"40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277"} Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.100816 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.100843 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.120164 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.665664137 podStartE2EDuration="31.120141541s" podCreationTimestamp="2026-03-18 12:30:08 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.686936795 +0000 UTC m=+1245.236857434" lastFinishedPulling="2026-03-18 12:30:33.141414199 +0000 UTC m=+1252.691334838" observedRunningTime="2026-03-18 12:30:39.114665915 +0000 UTC m=+1258.664586554" watchObservedRunningTime="2026-03-18 12:30:39.120141541 +0000 UTC m=+1258.670062180" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.134794 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.033046119 podStartE2EDuration="30.134776817s" podCreationTimestamp="2026-03-18 12:30:09 +0000 UTC" firstStartedPulling="2026-03-18 12:30:25.485962443 +0000 UTC m=+1245.035883082" lastFinishedPulling="2026-03-18 12:30:32.587693131 +0000 UTC m=+1252.137613780" observedRunningTime="2026-03-18 12:30:39.133410958 +0000 UTC m=+1258.683331607" watchObservedRunningTime="2026-03-18 
12:30:39.134776817 +0000 UTC m=+1258.684697466" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.140402 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.144915 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.390043 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-wm6zk"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.398663 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.401500 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.426072 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-wm6zk"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.441315 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-p6tzr"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.442538 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.448085 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.490709 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p6tzr"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.503977 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-combined-ca-bundle\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504090 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: 
\"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504142 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bedf417-75a0-4163-88ee-c11ea02ae1f4-config\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504166 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njxbh\" (UniqueName: \"kubernetes.io/projected/4bedf417-75a0-4163-88ee-c11ea02ae1f4-kube-api-access-njxbh\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504184 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-config\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504229 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovn-rundir\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504299 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdf4\" (UniqueName: \"kubernetes.io/projected/3335ff88-9f6c-47d6-96ba-57e18def2eb3-kube-api-access-fpdf4\") pod 
\"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.504326 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovs-rundir\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.605543 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdf4\" (UniqueName: \"kubernetes.io/projected/3335ff88-9f6c-47d6-96ba-57e18def2eb3-kube-api-access-fpdf4\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.605904 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovs-rundir\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606086 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606232 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606347 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-combined-ca-bundle\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606479 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bedf417-75a0-4163-88ee-c11ea02ae1f4-config\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njxbh\" (UniqueName: \"kubernetes.io/projected/4bedf417-75a0-4163-88ee-c11ea02ae1f4-kube-api-access-njxbh\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606362 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovs-rundir\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") 
" pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.606860 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-config\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.607104 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovn-rundir\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.607299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovn-rundir\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.607513 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.607728 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.608441 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-config\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.608511 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bedf417-75a0-4163-88ee-c11ea02ae1f4-config\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.616327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-combined-ca-bundle\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.629701 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.644010 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-wm6zk"] Mar 18 12:30:39 crc kubenswrapper[4921]: E0318 12:30:39.644649 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-fpdf4], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" podUID="3335ff88-9f6c-47d6-96ba-57e18def2eb3" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.665104 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdf4\" (UniqueName: \"kubernetes.io/projected/3335ff88-9f6c-47d6-96ba-57e18def2eb3-kube-api-access-fpdf4\") pod \"dnsmasq-dns-7f896c8c65-wm6zk\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.685041 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxbh\" (UniqueName: \"kubernetes.io/projected/4bedf417-75a0-4163-88ee-c11ea02ae1f4-kube-api-access-njxbh\") pod \"ovn-controller-metrics-p6tzr\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.708614 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.710874 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.733344 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.733572 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.733724 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.733943 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4bldq" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.764624 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.767686 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.812992 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-scripts\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.813805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.813939 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-config\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.814053 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.814265 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-metrics-certs-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.814439 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.814588 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnm5h\" (UniqueName: \"kubernetes.io/projected/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-kube-api-access-tnm5h\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.839917 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b2jnk"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.841701 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.849262 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.863308 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b2jnk"] Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916158 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916810 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916853 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916888 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnm5h\" (UniqueName: \"kubernetes.io/projected/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-kube-api-access-tnm5h\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916904 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dklv\" (UniqueName: \"kubernetes.io/projected/1e227090-452d-4ca2-8b06-13327a1aa4f8-kube-api-access-8dklv\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-scripts\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916954 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.916979 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.917009 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-config\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.917033 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.917090 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.917132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-config\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.919053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-scripts\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.919503 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-config\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.919776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 
12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.921369 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.922043 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.935437 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.953566 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnm5h\" (UniqueName: \"kubernetes.io/projected/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-kube-api-access-tnm5h\") pod \"ovn-northd-0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " pod="openstack/ovn-northd-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.994579 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 12:30:39 crc kubenswrapper[4921]: I0318 12:30:39.994617 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.018401 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.018490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dklv\" (UniqueName: \"kubernetes.io/projected/1e227090-452d-4ca2-8b06-13327a1aa4f8-kube-api-access-8dklv\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.018531 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.018583 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.018607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-config\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.019502 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-config\") pod 
\"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.019559 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.019625 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.020390 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.039450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dklv\" (UniqueName: \"kubernetes.io/projected/1e227090-452d-4ca2-8b06-13327a1aa4f8-kube-api-access-8dklv\") pod \"dnsmasq-dns-86db49b7ff-b2jnk\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.053773 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.125979 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.139511 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.196984 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.319383 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p6tzr"] Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.323881 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-ovsdbserver-sb\") pod \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.323944 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-dns-svc\") pod \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.324046 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpdf4\" (UniqueName: \"kubernetes.io/projected/3335ff88-9f6c-47d6-96ba-57e18def2eb3-kube-api-access-fpdf4\") pod \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") " Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.324207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-config\") pod \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\" (UID: \"3335ff88-9f6c-47d6-96ba-57e18def2eb3\") 
" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.324704 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-config" (OuterVolumeSpecName: "config") pod "3335ff88-9f6c-47d6-96ba-57e18def2eb3" (UID: "3335ff88-9f6c-47d6-96ba-57e18def2eb3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.324720 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3335ff88-9f6c-47d6-96ba-57e18def2eb3" (UID: "3335ff88-9f6c-47d6-96ba-57e18def2eb3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.324754 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3335ff88-9f6c-47d6-96ba-57e18def2eb3" (UID: "3335ff88-9f6c-47d6-96ba-57e18def2eb3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.325620 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.325640 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.325652 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3335ff88-9f6c-47d6-96ba-57e18def2eb3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.329022 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3335ff88-9f6c-47d6-96ba-57e18def2eb3-kube-api-access-fpdf4" (OuterVolumeSpecName: "kube-api-access-fpdf4") pod "3335ff88-9f6c-47d6-96ba-57e18def2eb3" (UID: "3335ff88-9f6c-47d6-96ba-57e18def2eb3"). InnerVolumeSpecName "kube-api-access-fpdf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:40 crc kubenswrapper[4921]: W0318 12:30:40.333898 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bedf417_75a0_4163_88ee_c11ea02ae1f4.slice/crio-b132a8633b6483dfdf7634ec61b2e3ac778375792c88eaea256d55b8ffed8e4e WatchSource:0}: Error finding container b132a8633b6483dfdf7634ec61b2e3ac778375792c88eaea256d55b8ffed8e4e: Status 404 returned error can't find the container with id b132a8633b6483dfdf7634ec61b2e3ac778375792c88eaea256d55b8ffed8e4e Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.427503 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpdf4\" (UniqueName: \"kubernetes.io/projected/3335ff88-9f6c-47d6-96ba-57e18def2eb3-kube-api-access-fpdf4\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.595846 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:30:40 crc kubenswrapper[4921]: W0318 12:30:40.607003 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef9b9cb_5b95_4fec_b2c7_905ddb5aebe0.slice/crio-5c457b678e38dcb5ff2c47a0a385a8fd32f276bf88160bc492c81ef53aca628e WatchSource:0}: Error finding container 5c457b678e38dcb5ff2c47a0a385a8fd32f276bf88160bc492c81ef53aca628e: Status 404 returned error can't find the container with id 5c457b678e38dcb5ff2c47a0a385a8fd32f276bf88160bc492c81ef53aca628e Mar 18 12:30:40 crc kubenswrapper[4921]: I0318 12:30:40.726966 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b2jnk"] Mar 18 12:30:40 crc kubenswrapper[4921]: W0318 12:30:40.740011 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e227090_452d_4ca2_8b06_13327a1aa4f8.slice/crio-9e5d801d9c3464e9c6909a64ee4515cd5b95daf8bd4c9bca9c8b5a97fa408749 WatchSource:0}: Error finding container 9e5d801d9c3464e9c6909a64ee4515cd5b95daf8bd4c9bca9c8b5a97fa408749: Status 404 returned error can't find the container with id 9e5d801d9c3464e9c6909a64ee4515cd5b95daf8bd4c9bca9c8b5a97fa408749 Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.135402 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p6tzr" event={"ID":"4bedf417-75a0-4163-88ee-c11ea02ae1f4","Type":"ContainerStarted","Data":"f8e01516efcfac1821ca3c35e166cc122d12a03ad3ebfd843f89a0549e5ee5c2"} Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.135768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p6tzr" event={"ID":"4bedf417-75a0-4163-88ee-c11ea02ae1f4","Type":"ContainerStarted","Data":"b132a8633b6483dfdf7634ec61b2e3ac778375792c88eaea256d55b8ffed8e4e"} Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.140649 4921 generic.go:334] "Generic (PLEG): container finished" podID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerID="e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4" exitCode=0 Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.140719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" event={"ID":"1e227090-452d-4ca2-8b06-13327a1aa4f8","Type":"ContainerDied","Data":"e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4"} Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.140772 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" event={"ID":"1e227090-452d-4ca2-8b06-13327a1aa4f8","Type":"ContainerStarted","Data":"9e5d801d9c3464e9c6909a64ee4515cd5b95daf8bd4c9bca9c8b5a97fa408749"} Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.155328 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-p6tzr" podStartSLOduration=2.155312902 podStartE2EDuration="2.155312902s" podCreationTimestamp="2026-03-18 12:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:30:41.151753821 +0000 UTC m=+1260.701674460" watchObservedRunningTime="2026-03-18 12:30:41.155312902 +0000 UTC m=+1260.705233541" Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.180333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0","Type":"ContainerStarted","Data":"5c457b678e38dcb5ff2c47a0a385a8fd32f276bf88160bc492c81ef53aca628e"} Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.181266 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-wm6zk" Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.269749 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-wm6zk"] Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.279499 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-wm6zk"] Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.368529 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.368573 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:41 crc kubenswrapper[4921]: I0318 12:30:41.403184 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.187376 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" 
event={"ID":"1e227090-452d-4ca2-8b06-13327a1aa4f8","Type":"ContainerStarted","Data":"b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a"} Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.187760 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.192683 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0","Type":"ContainerStarted","Data":"487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4"} Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.192722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0","Type":"ContainerStarted","Data":"513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3"} Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.192740 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.244246 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" podStartSLOduration=3.24421603 podStartE2EDuration="3.24421603s" podCreationTimestamp="2026-03-18 12:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:30:42.242315696 +0000 UTC m=+1261.792236335" watchObservedRunningTime="2026-03-18 12:30:42.24421603 +0000 UTC m=+1261.794136669" Mar 18 12:30:42 crc kubenswrapper[4921]: I0318 12:30:42.281309 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.115871651 podStartE2EDuration="3.281292434s" podCreationTimestamp="2026-03-18 12:30:39 +0000 UTC" firstStartedPulling="2026-03-18 12:30:40.609928961 +0000 
UTC m=+1260.159849600" lastFinishedPulling="2026-03-18 12:30:41.775349744 +0000 UTC m=+1261.325270383" observedRunningTime="2026-03-18 12:30:42.276260561 +0000 UTC m=+1261.826181210" watchObservedRunningTime="2026-03-18 12:30:42.281292434 +0000 UTC m=+1261.831213063" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.218823 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3335ff88-9f6c-47d6-96ba-57e18def2eb3" path="/var/lib/kubelet/pods/3335ff88-9f6c-47d6-96ba-57e18def2eb3/volumes" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.724785 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b2jnk"] Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.751788 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gkwk7"] Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.753433 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.767073 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gkwk7"] Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.776768 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.895409 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6b9\" (UniqueName: \"kubernetes.io/projected/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-kube-api-access-8x6b9\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.895477 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-dns-svc\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.895529 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-config\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.895547 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.895854 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.972569 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.997678 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6b9\" (UniqueName: \"kubernetes.io/projected/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-kube-api-access-8x6b9\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 
18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.997737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-dns-svc\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.997773 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-config\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.997790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.997835 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.998850 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-dns-svc\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.998857 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-config\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.998899 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:43 crc kubenswrapper[4921]: I0318 12:30:43.998963 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.020077 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6b9\" (UniqueName: \"kubernetes.io/projected/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-kube-api-access-8x6b9\") pod \"dnsmasq-dns-698758b865-gkwk7\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.046609 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.075913 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.207892 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerName="dnsmasq-dns" containerID="cri-o://b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a" gracePeriod=10 Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.551339 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gkwk7"] Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.631482 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.710437 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-sb\") pod \"1e227090-452d-4ca2-8b06-13327a1aa4f8\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.710544 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-dns-svc\") pod \"1e227090-452d-4ca2-8b06-13327a1aa4f8\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.710597 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-config\") pod \"1e227090-452d-4ca2-8b06-13327a1aa4f8\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.710677 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-nb\") pod \"1e227090-452d-4ca2-8b06-13327a1aa4f8\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.710722 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dklv\" (UniqueName: \"kubernetes.io/projected/1e227090-452d-4ca2-8b06-13327a1aa4f8-kube-api-access-8dklv\") pod \"1e227090-452d-4ca2-8b06-13327a1aa4f8\" (UID: \"1e227090-452d-4ca2-8b06-13327a1aa4f8\") " Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.718083 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e227090-452d-4ca2-8b06-13327a1aa4f8-kube-api-access-8dklv" (OuterVolumeSpecName: "kube-api-access-8dklv") pod "1e227090-452d-4ca2-8b06-13327a1aa4f8" (UID: "1e227090-452d-4ca2-8b06-13327a1aa4f8"). InnerVolumeSpecName "kube-api-access-8dklv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.761837 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e227090-452d-4ca2-8b06-13327a1aa4f8" (UID: "1e227090-452d-4ca2-8b06-13327a1aa4f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.763282 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e227090-452d-4ca2-8b06-13327a1aa4f8" (UID: "1e227090-452d-4ca2-8b06-13327a1aa4f8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.773715 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-config" (OuterVolumeSpecName: "config") pod "1e227090-452d-4ca2-8b06-13327a1aa4f8" (UID: "1e227090-452d-4ca2-8b06-13327a1aa4f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.781842 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e227090-452d-4ca2-8b06-13327a1aa4f8" (UID: "1e227090-452d-4ca2-8b06-13327a1aa4f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.813089 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.813146 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.813157 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.813173 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dklv\" (UniqueName: \"kubernetes.io/projected/1e227090-452d-4ca2-8b06-13327a1aa4f8-kube-api-access-8dklv\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:44 crc 
kubenswrapper[4921]: I0318 12:30:44.813183 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e227090-452d-4ca2-8b06-13327a1aa4f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.886193 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:30:44 crc kubenswrapper[4921]: E0318 12:30:44.886574 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerName="dnsmasq-dns" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.886593 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerName="dnsmasq-dns" Mar 18 12:30:44 crc kubenswrapper[4921]: E0318 12:30:44.886654 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerName="init" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.886663 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerName="init" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.886864 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerName="dnsmasq-dns" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.896841 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.901808 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-zkvtc" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.901956 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.902094 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.902196 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 12:30:44 crc kubenswrapper[4921]: I0318 12:30:44.920533 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.022649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-lock\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.022743 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2hwm\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-kube-api-access-q2hwm\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.022793 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204df50-7907-4d3b-a8b3-5aee222044f2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " 
pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.022884 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.022913 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.022987 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-cache\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.125285 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.125334 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.125405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-cache\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.125441 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-lock\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.125468 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2hwm\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-kube-api-access-q2hwm\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.125497 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204df50-7907-4d3b-a8b3-5aee222044f2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.126574 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-cache\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.126923 4921 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.126953 4921 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 
12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.127004 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift podName:2204df50-7907-4d3b-a8b3-5aee222044f2 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:45.626985291 +0000 UTC m=+1265.176905920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift") pod "swift-storage-0" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2") : configmap "swift-ring-files" not found Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.126925 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.127440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-lock\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.131712 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204df50-7907-4d3b-a8b3-5aee222044f2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.158867 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2hwm\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-kube-api-access-q2hwm\") pod \"swift-storage-0\" (UID: 
\"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.172351 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.219548 4921 generic.go:334] "Generic (PLEG): container finished" podID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerID="3bc6b45b28d208ccd35f4737c3c49593f8e255a8ff5900ff124377ea2b972816" exitCode=0 Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.219653 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gkwk7" event={"ID":"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938","Type":"ContainerDied","Data":"3bc6b45b28d208ccd35f4737c3c49593f8e255a8ff5900ff124377ea2b972816"} Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.219690 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gkwk7" event={"ID":"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938","Type":"ContainerStarted","Data":"b29b1b29b8e30e1739d13e2af677d49561f637553fecce3037de35d4fb297e70"} Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.222707 4921 generic.go:334] "Generic (PLEG): container finished" podID="1e227090-452d-4ca2-8b06-13327a1aa4f8" containerID="b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a" exitCode=0 Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.222741 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" event={"ID":"1e227090-452d-4ca2-8b06-13327a1aa4f8","Type":"ContainerDied","Data":"b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a"} Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.222763 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" event={"ID":"1e227090-452d-4ca2-8b06-13327a1aa4f8","Type":"ContainerDied","Data":"9e5d801d9c3464e9c6909a64ee4515cd5b95daf8bd4c9bca9c8b5a97fa408749"} Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.222779 4921 scope.go:117] "RemoveContainer" containerID="b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.222883 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-b2jnk" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.305346 4921 scope.go:117] "RemoveContainer" containerID="e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.309909 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b2jnk"] Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.316670 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-b2jnk"] Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.350755 4921 scope.go:117] "RemoveContainer" containerID="b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a" Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.351320 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a\": container with ID starting with b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a not found: ID does not exist" containerID="b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.351355 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a"} err="failed to get container status 
\"b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a\": rpc error: code = NotFound desc = could not find container \"b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a\": container with ID starting with b9ee09ff440aa364e82862b56bcc77e1596896c6d2766b09a65ead1c594a219a not found: ID does not exist" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.351377 4921 scope.go:117] "RemoveContainer" containerID="e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4" Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.351698 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4\": container with ID starting with e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4 not found: ID does not exist" containerID="e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.351757 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4"} err="failed to get container status \"e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4\": rpc error: code = NotFound desc = could not find container \"e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4\": container with ID starting with e9a38227fb0a19d9638ee4719a7352b385832760f5cbac962aaab0f3e9f97ac4 not found: ID does not exist" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.376507 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j94bd"] Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.377929 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.385194 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.385419 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.385574 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.426363 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j94bd"] Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.432041 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-ring-data-devices\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.432082 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6989392-6285-4c0d-80e4-3d2d30461e4f-etc-swift\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.432106 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8xl\" (UniqueName: \"kubernetes.io/projected/f6989392-6285-4c0d-80e4-3d2d30461e4f-kube-api-access-jc8xl\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: 
I0318 12:30:45.432192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-swiftconf\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.432212 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-scripts\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.432313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-dispersionconf\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.432351 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-combined-ca-bundle\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.533827 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6989392-6285-4c0d-80e4-3d2d30461e4f-etc-swift\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534207 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8xl\" (UniqueName: \"kubernetes.io/projected/f6989392-6285-4c0d-80e4-3d2d30461e4f-kube-api-access-jc8xl\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534325 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-swiftconf\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534355 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-scripts\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534410 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-dispersionconf\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6989392-6285-4c0d-80e4-3d2d30461e4f-etc-swift\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-combined-ca-bundle\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.534505 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-ring-data-devices\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.535281 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-scripts\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.535337 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-ring-data-devices\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.539982 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-swiftconf\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.541724 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-combined-ca-bundle\") pod 
\"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.545104 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-dispersionconf\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.551720 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8xl\" (UniqueName: \"kubernetes.io/projected/f6989392-6285-4c0d-80e4-3d2d30461e4f-kube-api-access-jc8xl\") pod \"swift-ring-rebalance-j94bd\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.636228 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.636422 4921 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.636455 4921 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:30:45 crc kubenswrapper[4921]: E0318 12:30:45.636503 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift podName:2204df50-7907-4d3b-a8b3-5aee222044f2 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:30:46.636485891 +0000 UTC m=+1266.186406530 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift") pod "swift-storage-0" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2") : configmap "swift-ring-files" not found Mar 18 12:30:45 crc kubenswrapper[4921]: I0318 12:30:45.726083 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.020208 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j94bd"] Mar 18 12:30:46 crc kubenswrapper[4921]: W0318 12:30:46.035303 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6989392_6285_4c0d_80e4_3d2d30461e4f.slice/crio-2b5e26a5ee5fb99033014633f25111f23cd46667dda55f0264924d7cdb35f340 WatchSource:0}: Error finding container 2b5e26a5ee5fb99033014633f25111f23cd46667dda55f0264924d7cdb35f340: Status 404 returned error can't find the container with id 2b5e26a5ee5fb99033014633f25111f23cd46667dda55f0264924d7cdb35f340 Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.132918 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.233782 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.235739 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gkwk7" event={"ID":"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938","Type":"ContainerStarted","Data":"41fa41eca02ba769262b6d0dbf93db9fda957aa59e98c9cf862735f757d8a357"} Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.235849 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.237074 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j94bd" event={"ID":"f6989392-6285-4c0d-80e4-3d2d30461e4f","Type":"ContainerStarted","Data":"2b5e26a5ee5fb99033014633f25111f23cd46667dda55f0264924d7cdb35f340"} Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.288221 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gkwk7" podStartSLOduration=3.288195914 podStartE2EDuration="3.288195914s" podCreationTimestamp="2026-03-18 12:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:30:46.282958015 +0000 UTC m=+1265.832878664" watchObservedRunningTime="2026-03-18 12:30:46.288195914 +0000 UTC m=+1265.838116553" Mar 18 12:30:46 crc kubenswrapper[4921]: I0318 12:30:46.655966 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:46 crc kubenswrapper[4921]: E0318 12:30:46.656215 4921 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:30:46 crc kubenswrapper[4921]: E0318 12:30:46.656248 4921 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:30:46 crc kubenswrapper[4921]: E0318 12:30:46.656313 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift podName:2204df50-7907-4d3b-a8b3-5aee222044f2 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:30:48.656290114 +0000 UTC m=+1268.206210763 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift") pod "swift-storage-0" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2") : configmap "swift-ring-files" not found Mar 18 12:30:47 crc kubenswrapper[4921]: I0318 12:30:47.218052 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e227090-452d-4ca2-8b06-13327a1aa4f8" path="/var/lib/kubelet/pods/1e227090-452d-4ca2-8b06-13327a1aa4f8/volumes" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.496625 4921 scope.go:117] "RemoveContainer" containerID="872a95c472a62a31d048e26afdf3b12b40482aa2dcc422059ad01003c581709c" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.691702 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:48 crc kubenswrapper[4921]: E0318 12:30:48.691899 4921 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:30:48 crc kubenswrapper[4921]: E0318 12:30:48.691941 4921 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:30:48 crc kubenswrapper[4921]: E0318 12:30:48.692005 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift podName:2204df50-7907-4d3b-a8b3-5aee222044f2 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:52.691982971 +0000 UTC m=+1272.241903620 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift") pod "swift-storage-0" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2") : configmap "swift-ring-files" not found Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.715557 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cvrpq"] Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.716636 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.727560 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cvrpq"] Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.729193 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.895522 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2313771d-6034-43c4-8f73-8d67111db536-operator-scripts\") pod \"root-account-create-update-cvrpq\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.895692 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chb7\" (UniqueName: \"kubernetes.io/projected/2313771d-6034-43c4-8f73-8d67111db536-kube-api-access-8chb7\") pod \"root-account-create-update-cvrpq\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.997168 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2313771d-6034-43c4-8f73-8d67111db536-operator-scripts\") pod \"root-account-create-update-cvrpq\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.997282 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chb7\" (UniqueName: \"kubernetes.io/projected/2313771d-6034-43c4-8f73-8d67111db536-kube-api-access-8chb7\") pod \"root-account-create-update-cvrpq\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:48 crc kubenswrapper[4921]: I0318 12:30:48.998178 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2313771d-6034-43c4-8f73-8d67111db536-operator-scripts\") pod \"root-account-create-update-cvrpq\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:49 crc kubenswrapper[4921]: I0318 12:30:49.032282 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chb7\" (UniqueName: \"kubernetes.io/projected/2313771d-6034-43c4-8f73-8d67111db536-kube-api-access-8chb7\") pod \"root-account-create-update-cvrpq\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:49 crc kubenswrapper[4921]: I0318 12:30:49.332294 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:49 crc kubenswrapper[4921]: I0318 12:30:49.781730 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cvrpq"] Mar 18 12:30:50 crc kubenswrapper[4921]: I0318 12:30:50.264898 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cvrpq" event={"ID":"2313771d-6034-43c4-8f73-8d67111db536","Type":"ContainerStarted","Data":"210833d6eb9a2ee47d9fb8e0049a2f165dc00d3c13a7d774214751cfd6d0f36f"} Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.566520 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kjxvt"] Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.567819 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.617214 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kjxvt"] Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.688597 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-16b4-account-create-update-5nmq2"] Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.690010 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.693461 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.697840 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-16b4-account-create-update-5nmq2"] Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.741779 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6de4f-07c3-4217-956b-22b24f366220-operator-scripts\") pod \"glance-db-create-kjxvt\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.741878 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0f23d5-c8db-45a5-8c30-c746a0157440-operator-scripts\") pod \"glance-16b4-account-create-update-5nmq2\" (UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.742347 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dzpm\" (UniqueName: \"kubernetes.io/projected/c7a6de4f-07c3-4217-956b-22b24f366220-kube-api-access-9dzpm\") pod \"glance-db-create-kjxvt\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.843602 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgdpb\" (UniqueName: \"kubernetes.io/projected/8f0f23d5-c8db-45a5-8c30-c746a0157440-kube-api-access-fgdpb\") pod \"glance-16b4-account-create-update-5nmq2\" 
(UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.843722 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dzpm\" (UniqueName: \"kubernetes.io/projected/c7a6de4f-07c3-4217-956b-22b24f366220-kube-api-access-9dzpm\") pod \"glance-db-create-kjxvt\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.843781 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6de4f-07c3-4217-956b-22b24f366220-operator-scripts\") pod \"glance-db-create-kjxvt\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.843871 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0f23d5-c8db-45a5-8c30-c746a0157440-operator-scripts\") pod \"glance-16b4-account-create-update-5nmq2\" (UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.844659 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6de4f-07c3-4217-956b-22b24f366220-operator-scripts\") pod \"glance-db-create-kjxvt\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.845060 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0f23d5-c8db-45a5-8c30-c746a0157440-operator-scripts\") pod \"glance-16b4-account-create-update-5nmq2\" (UID: 
\"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.865312 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dzpm\" (UniqueName: \"kubernetes.io/projected/c7a6de4f-07c3-4217-956b-22b24f366220-kube-api-access-9dzpm\") pod \"glance-db-create-kjxvt\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.901434 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.945676 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgdpb\" (UniqueName: \"kubernetes.io/projected/8f0f23d5-c8db-45a5-8c30-c746a0157440-kube-api-access-fgdpb\") pod \"glance-16b4-account-create-update-5nmq2\" (UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:51 crc kubenswrapper[4921]: I0318 12:30:51.961639 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgdpb\" (UniqueName: \"kubernetes.io/projected/8f0f23d5-c8db-45a5-8c30-c746a0157440-kube-api-access-fgdpb\") pod \"glance-16b4-account-create-update-5nmq2\" (UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.022385 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.297249 4921 generic.go:334] "Generic (PLEG): container finished" podID="2313771d-6034-43c4-8f73-8d67111db536" containerID="9af29f28d1d0bacc1715100e8b5b4d78585264b9810296a3a93d8895a39127c2" exitCode=0 Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.297333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cvrpq" event={"ID":"2313771d-6034-43c4-8f73-8d67111db536","Type":"ContainerDied","Data":"9af29f28d1d0bacc1715100e8b5b4d78585264b9810296a3a93d8895a39127c2"} Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.355560 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kjxvt"] Mar 18 12:30:52 crc kubenswrapper[4921]: W0318 12:30:52.360470 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7a6de4f_07c3_4217_956b_22b24f366220.slice/crio-be3072c8e23331eaaa99b805abbc4db10848d0417d3da876ecf5c4b7b7a4f226 WatchSource:0}: Error finding container be3072c8e23331eaaa99b805abbc4db10848d0417d3da876ecf5c4b7b7a4f226: Status 404 returned error can't find the container with id be3072c8e23331eaaa99b805abbc4db10848d0417d3da876ecf5c4b7b7a4f226 Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.484471 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-16b4-account-create-update-5nmq2"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.530567 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jdm4q"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.531976 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.539323 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jdm4q"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.630594 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4f7a-account-create-update-xn88l"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.631661 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.633983 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.643337 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4f7a-account-create-update-xn88l"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.657030 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698lr\" (UniqueName: \"kubernetes.io/projected/8872fccc-7e47-4ab7-8b31-c81b93fc72de-kube-api-access-698lr\") pod \"keystone-db-create-jdm4q\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.657211 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8872fccc-7e47-4ab7-8b31-c81b93fc72de-operator-scripts\") pod \"keystone-db-create-jdm4q\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.759156 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.759244 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698lr\" (UniqueName: \"kubernetes.io/projected/8872fccc-7e47-4ab7-8b31-c81b93fc72de-kube-api-access-698lr\") pod \"keystone-db-create-jdm4q\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.759338 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9fr\" (UniqueName: \"kubernetes.io/projected/aaa03737-632a-4574-a31a-93b05e3be3f0-kube-api-access-nq9fr\") pod \"keystone-4f7a-account-create-update-xn88l\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.759387 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8872fccc-7e47-4ab7-8b31-c81b93fc72de-operator-scripts\") pod \"keystone-db-create-jdm4q\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: E0318 12:30:52.759412 4921 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:30:52 crc kubenswrapper[4921]: E0318 12:30:52.759442 4921 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:30:52 crc kubenswrapper[4921]: E0318 12:30:52.759499 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift 
podName:2204df50-7907-4d3b-a8b3-5aee222044f2 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:00.759476554 +0000 UTC m=+1280.309397263 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift") pod "swift-storage-0" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2") : configmap "swift-ring-files" not found Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.759422 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa03737-632a-4574-a31a-93b05e3be3f0-operator-scripts\") pod \"keystone-4f7a-account-create-update-xn88l\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.760932 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8872fccc-7e47-4ab7-8b31-c81b93fc72de-operator-scripts\") pod \"keystone-db-create-jdm4q\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.766178 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ppcmx"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.767357 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.772410 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ppcmx"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.785946 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698lr\" (UniqueName: \"kubernetes.io/projected/8872fccc-7e47-4ab7-8b31-c81b93fc72de-kube-api-access-698lr\") pod \"keystone-db-create-jdm4q\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.836893 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9fc8-account-create-update-k6glt"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.838593 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.840756 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.847160 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9fc8-account-create-update-k6glt"] Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.861198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9fr\" (UniqueName: \"kubernetes.io/projected/aaa03737-632a-4574-a31a-93b05e3be3f0-kube-api-access-nq9fr\") pod \"keystone-4f7a-account-create-update-xn88l\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.861293 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aaa03737-632a-4574-a31a-93b05e3be3f0-operator-scripts\") pod \"keystone-4f7a-account-create-update-xn88l\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.861982 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa03737-632a-4574-a31a-93b05e3be3f0-operator-scripts\") pod \"keystone-4f7a-account-create-update-xn88l\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.869208 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.896824 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9fr\" (UniqueName: \"kubernetes.io/projected/aaa03737-632a-4574-a31a-93b05e3be3f0-kube-api-access-nq9fr\") pod \"keystone-4f7a-account-create-update-xn88l\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.955539 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.963412 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rfmt\" (UniqueName: \"kubernetes.io/projected/e7747f3a-18ba-402a-8d56-d6f7d359f1de-kube-api-access-7rfmt\") pod \"placement-9fc8-account-create-update-k6glt\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.963552 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7747f3a-18ba-402a-8d56-d6f7d359f1de-operator-scripts\") pod \"placement-9fc8-account-create-update-k6glt\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.963677 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlqzz\" (UniqueName: \"kubernetes.io/projected/9b6306e6-563d-4511-ba58-e685c1c9a599-kube-api-access-dlqzz\") pod \"placement-db-create-ppcmx\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:52 crc kubenswrapper[4921]: I0318 12:30:52.963763 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b6306e6-563d-4511-ba58-e685c1c9a599-operator-scripts\") pod \"placement-db-create-ppcmx\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.065317 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e7747f3a-18ba-402a-8d56-d6f7d359f1de-operator-scripts\") pod \"placement-9fc8-account-create-update-k6glt\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.065431 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlqzz\" (UniqueName: \"kubernetes.io/projected/9b6306e6-563d-4511-ba58-e685c1c9a599-kube-api-access-dlqzz\") pod \"placement-db-create-ppcmx\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.065501 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b6306e6-563d-4511-ba58-e685c1c9a599-operator-scripts\") pod \"placement-db-create-ppcmx\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.065527 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rfmt\" (UniqueName: \"kubernetes.io/projected/e7747f3a-18ba-402a-8d56-d6f7d359f1de-kube-api-access-7rfmt\") pod \"placement-9fc8-account-create-update-k6glt\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.066406 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7747f3a-18ba-402a-8d56-d6f7d359f1de-operator-scripts\") pod \"placement-9fc8-account-create-update-k6glt\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.068040 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b6306e6-563d-4511-ba58-e685c1c9a599-operator-scripts\") pod \"placement-db-create-ppcmx\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.081980 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rfmt\" (UniqueName: \"kubernetes.io/projected/e7747f3a-18ba-402a-8d56-d6f7d359f1de-kube-api-access-7rfmt\") pod \"placement-9fc8-account-create-update-k6glt\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.092039 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlqzz\" (UniqueName: \"kubernetes.io/projected/9b6306e6-563d-4511-ba58-e685c1c9a599-kube-api-access-dlqzz\") pod \"placement-db-create-ppcmx\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.153742 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.312032 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kjxvt" event={"ID":"c7a6de4f-07c3-4217-956b-22b24f366220","Type":"ContainerStarted","Data":"be3072c8e23331eaaa99b805abbc4db10848d0417d3da876ecf5c4b7b7a4f226"} Mar 18 12:30:53 crc kubenswrapper[4921]: I0318 12:30:53.387209 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:54 crc kubenswrapper[4921]: I0318 12:30:54.078478 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:30:54 crc kubenswrapper[4921]: I0318 12:30:54.143645 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvhbz"] Mar 18 12:30:54 crc kubenswrapper[4921]: I0318 12:30:54.143997 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerName="dnsmasq-dns" containerID="cri-o://fa2c32b2c306a96413d59bd0681650cb0a0da204cbf58820194a33330fabaa55" gracePeriod=10 Mar 18 12:30:54 crc kubenswrapper[4921]: I0318 12:30:54.323854 4921 generic.go:334] "Generic (PLEG): container finished" podID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerID="fa2c32b2c306a96413d59bd0681650cb0a0da204cbf58820194a33330fabaa55" exitCode=0 Mar 18 12:30:54 crc kubenswrapper[4921]: I0318 12:30:54.323899 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" event={"ID":"c73a721b-d0ec-48bb-8106-c35c9d1d3466","Type":"ContainerDied","Data":"fa2c32b2c306a96413d59bd0681650cb0a0da204cbf58820194a33330fabaa55"} Mar 18 12:30:54 crc kubenswrapper[4921]: W0318 12:30:54.750044 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f0f23d5_c8db_45a5_8c30_c746a0157440.slice/crio-7d3c94b856b43de51b210db21126709d39d1fd9d3087febc67db9a33e27585f7 WatchSource:0}: Error finding container 7d3c94b856b43de51b210db21126709d39d1fd9d3087febc67db9a33e27585f7: Status 404 returned error can't find the container with id 7d3c94b856b43de51b210db21126709d39d1fd9d3087febc67db9a33e27585f7 Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.018704 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.124101 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.131082 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2313771d-6034-43c4-8f73-8d67111db536-operator-scripts\") pod \"2313771d-6034-43c4-8f73-8d67111db536\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.131397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8chb7\" (UniqueName: \"kubernetes.io/projected/2313771d-6034-43c4-8f73-8d67111db536-kube-api-access-8chb7\") pod \"2313771d-6034-43c4-8f73-8d67111db536\" (UID: \"2313771d-6034-43c4-8f73-8d67111db536\") " Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.133896 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2313771d-6034-43c4-8f73-8d67111db536-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2313771d-6034-43c4-8f73-8d67111db536" (UID: "2313771d-6034-43c4-8f73-8d67111db536"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.140027 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2313771d-6034-43c4-8f73-8d67111db536-kube-api-access-8chb7" (OuterVolumeSpecName: "kube-api-access-8chb7") pod "2313771d-6034-43c4-8f73-8d67111db536" (UID: "2313771d-6034-43c4-8f73-8d67111db536"). InnerVolumeSpecName "kube-api-access-8chb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.235112 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8wb6\" (UniqueName: \"kubernetes.io/projected/c73a721b-d0ec-48bb-8106-c35c9d1d3466-kube-api-access-p8wb6\") pod \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.235740 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-dns-svc\") pod \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.235788 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-config\") pod \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\" (UID: \"c73a721b-d0ec-48bb-8106-c35c9d1d3466\") " Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.236122 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2313771d-6034-43c4-8f73-8d67111db536-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.236152 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8chb7\" (UniqueName: \"kubernetes.io/projected/2313771d-6034-43c4-8f73-8d67111db536-kube-api-access-8chb7\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.266855 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73a721b-d0ec-48bb-8106-c35c9d1d3466-kube-api-access-p8wb6" (OuterVolumeSpecName: "kube-api-access-p8wb6") pod "c73a721b-d0ec-48bb-8106-c35c9d1d3466" (UID: 
"c73a721b-d0ec-48bb-8106-c35c9d1d3466"). InnerVolumeSpecName "kube-api-access-p8wb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.350325 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8wb6\" (UniqueName: \"kubernetes.io/projected/c73a721b-d0ec-48bb-8106-c35c9d1d3466-kube-api-access-p8wb6\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.390993 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kjxvt" event={"ID":"c7a6de4f-07c3-4217-956b-22b24f366220","Type":"ContainerStarted","Data":"c0cfcab1c54f029107c9b49c527245287e48346e19a182d6826532d18b88fff2"} Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.393032 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j94bd" event={"ID":"f6989392-6285-4c0d-80e4-3d2d30461e4f","Type":"ContainerStarted","Data":"70abdc91b6f2fb58442ccc59e3cfb2b789054925784b519666261caf23ed88b3"} Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.408024 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-16b4-account-create-update-5nmq2" event={"ID":"8f0f23d5-c8db-45a5-8c30-c746a0157440","Type":"ContainerStarted","Data":"f0948baeeda676ec306be8c3a4d0f0f3fb0ef395106571ae3905a145ee1a1119"} Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.411846 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-16b4-account-create-update-5nmq2" event={"ID":"8f0f23d5-c8db-45a5-8c30-c746a0157440","Type":"ContainerStarted","Data":"7d3c94b856b43de51b210db21126709d39d1fd9d3087febc67db9a33e27585f7"} Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.425797 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.426152 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvhbz" event={"ID":"c73a721b-d0ec-48bb-8106-c35c9d1d3466","Type":"ContainerDied","Data":"a4b3a0fb211fe7ef0aa26331234d33f0ecb586b419593c56430f9bec4868febc"} Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.426189 4921 scope.go:117] "RemoveContainer" containerID="fa2c32b2c306a96413d59bd0681650cb0a0da204cbf58820194a33330fabaa55" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.438554 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-kjxvt" podStartSLOduration=4.438531445 podStartE2EDuration="4.438531445s" podCreationTimestamp="2026-03-18 12:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:30:55.425384501 +0000 UTC m=+1274.975305150" watchObservedRunningTime="2026-03-18 12:30:55.438531445 +0000 UTC m=+1274.988452074" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.452739 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cvrpq" event={"ID":"2313771d-6034-43c4-8f73-8d67111db536","Type":"ContainerDied","Data":"210833d6eb9a2ee47d9fb8e0049a2f165dc00d3c13a7d774214751cfd6d0f36f"} Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.452787 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210833d6eb9a2ee47d9fb8e0049a2f165dc00d3c13a7d774214751cfd6d0f36f" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.452847 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cvrpq" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.463891 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-config" (OuterVolumeSpecName: "config") pod "c73a721b-d0ec-48bb-8106-c35c9d1d3466" (UID: "c73a721b-d0ec-48bb-8106-c35c9d1d3466"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.478395 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ppcmx"] Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.492253 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c73a721b-d0ec-48bb-8106-c35c9d1d3466" (UID: "c73a721b-d0ec-48bb-8106-c35c9d1d3466"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.499804 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j94bd" podStartSLOduration=1.664699235 podStartE2EDuration="10.499780576s" podCreationTimestamp="2026-03-18 12:30:45 +0000 UTC" firstStartedPulling="2026-03-18 12:30:46.036768438 +0000 UTC m=+1265.586689077" lastFinishedPulling="2026-03-18 12:30:54.871849779 +0000 UTC m=+1274.421770418" observedRunningTime="2026-03-18 12:30:55.453557392 +0000 UTC m=+1275.003478031" watchObservedRunningTime="2026-03-18 12:30:55.499780576 +0000 UTC m=+1275.049701215" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.501527 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-16b4-account-create-update-5nmq2" podStartSLOduration=4.501515405 podStartE2EDuration="4.501515405s" podCreationTimestamp="2026-03-18 12:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:30:55.485944142 +0000 UTC m=+1275.035864781" watchObservedRunningTime="2026-03-18 12:30:55.501515405 +0000 UTC m=+1275.051436044" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.502856 4921 scope.go:117] "RemoveContainer" containerID="3508bb686babe7a6f0e49ba27aef0d00dcfce1ee6ff5588ed84d3b681ae3dcce" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.554226 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.554259 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73a721b-d0ec-48bb-8106-c35c9d1d3466-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 
12:30:55.742181 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9fc8-account-create-update-k6glt"] Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.753760 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4f7a-account-create-update-xn88l"] Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.763169 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jdm4q"] Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.981025 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvhbz"] Mar 18 12:30:55 crc kubenswrapper[4921]: I0318 12:30:55.994761 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvhbz"] Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.461094 4921 generic.go:334] "Generic (PLEG): container finished" podID="8872fccc-7e47-4ab7-8b31-c81b93fc72de" containerID="7c94fca6c5293bb11d0c047033353aa8038883ef1a81a2a71b1ff8f8c52e7d06" exitCode=0 Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.461163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jdm4q" event={"ID":"8872fccc-7e47-4ab7-8b31-c81b93fc72de","Type":"ContainerDied","Data":"7c94fca6c5293bb11d0c047033353aa8038883ef1a81a2a71b1ff8f8c52e7d06"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.461714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jdm4q" event={"ID":"8872fccc-7e47-4ab7-8b31-c81b93fc72de","Type":"ContainerStarted","Data":"dfaa7bd319fd8f56246e04dd873db0dc06eca6c0f300eb9d8a15b803cc162d8b"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.465959 4921 generic.go:334] "Generic (PLEG): container finished" podID="e7747f3a-18ba-402a-8d56-d6f7d359f1de" containerID="fdd42d8f01500d855fa39f90591a9243c08485ff85370402767a972c29d33ddc" exitCode=0 Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.466073 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9fc8-account-create-update-k6glt" event={"ID":"e7747f3a-18ba-402a-8d56-d6f7d359f1de","Type":"ContainerDied","Data":"fdd42d8f01500d855fa39f90591a9243c08485ff85370402767a972c29d33ddc"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.466199 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9fc8-account-create-update-k6glt" event={"ID":"e7747f3a-18ba-402a-8d56-d6f7d359f1de","Type":"ContainerStarted","Data":"accedaca4908472d661212823b48b3b3d674cea8bdd74b35e1f675ddacdfbe7a"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.468544 4921 generic.go:334] "Generic (PLEG): container finished" podID="aaa03737-632a-4574-a31a-93b05e3be3f0" containerID="96fa5b7f7c8a41b2a620669f22f68be7614f4871e7bc352d6e8f084f65808780" exitCode=0 Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.468615 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4f7a-account-create-update-xn88l" event={"ID":"aaa03737-632a-4574-a31a-93b05e3be3f0","Type":"ContainerDied","Data":"96fa5b7f7c8a41b2a620669f22f68be7614f4871e7bc352d6e8f084f65808780"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.468673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4f7a-account-create-update-xn88l" event={"ID":"aaa03737-632a-4574-a31a-93b05e3be3f0","Type":"ContainerStarted","Data":"9c0b8ba1b2c4ea80dc378f2269853937918b386cdcc3973244ad01cead0cff03"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.470566 4921 generic.go:334] "Generic (PLEG): container finished" podID="c7a6de4f-07c3-4217-956b-22b24f366220" containerID="c0cfcab1c54f029107c9b49c527245287e48346e19a182d6826532d18b88fff2" exitCode=0 Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.470615 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kjxvt" 
event={"ID":"c7a6de4f-07c3-4217-956b-22b24f366220","Type":"ContainerDied","Data":"c0cfcab1c54f029107c9b49c527245287e48346e19a182d6826532d18b88fff2"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.473230 4921 generic.go:334] "Generic (PLEG): container finished" podID="9b6306e6-563d-4511-ba58-e685c1c9a599" containerID="4c273d1e1bd70e964314248da90f6136b916bbbe7e760714e621f8c923c98fa7" exitCode=0 Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.473306 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ppcmx" event={"ID":"9b6306e6-563d-4511-ba58-e685c1c9a599","Type":"ContainerDied","Data":"4c273d1e1bd70e964314248da90f6136b916bbbe7e760714e621f8c923c98fa7"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.473331 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ppcmx" event={"ID":"9b6306e6-563d-4511-ba58-e685c1c9a599","Type":"ContainerStarted","Data":"18d2b0dfd657420a081a90ca342f46495d72561f9e11a16dfc8e0e98361a9c55"} Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.474912 4921 generic.go:334] "Generic (PLEG): container finished" podID="8f0f23d5-c8db-45a5-8c30-c746a0157440" containerID="f0948baeeda676ec306be8c3a4d0f0f3fb0ef395106571ae3905a145ee1a1119" exitCode=0 Mar 18 12:30:56 crc kubenswrapper[4921]: I0318 12:30:56.474953 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-16b4-account-create-update-5nmq2" event={"ID":"8f0f23d5-c8db-45a5-8c30-c746a0157440","Type":"ContainerDied","Data":"f0948baeeda676ec306be8c3a4d0f0f3fb0ef395106571ae3905a145ee1a1119"} Mar 18 12:30:57 crc kubenswrapper[4921]: I0318 12:30:57.220086 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" path="/var/lib/kubelet/pods/c73a721b-d0ec-48bb-8106-c35c9d1d3466/volumes" Mar 18 12:30:57 crc kubenswrapper[4921]: I0318 12:30:57.966041 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.112003 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rfmt\" (UniqueName: \"kubernetes.io/projected/e7747f3a-18ba-402a-8d56-d6f7d359f1de-kube-api-access-7rfmt\") pod \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.112542 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7747f3a-18ba-402a-8d56-d6f7d359f1de-operator-scripts\") pod \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\" (UID: \"e7747f3a-18ba-402a-8d56-d6f7d359f1de\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.113425 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7747f3a-18ba-402a-8d56-d6f7d359f1de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7747f3a-18ba-402a-8d56-d6f7d359f1de" (UID: "e7747f3a-18ba-402a-8d56-d6f7d359f1de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.113802 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7747f3a-18ba-402a-8d56-d6f7d359f1de-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.118589 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7747f3a-18ba-402a-8d56-d6f7d359f1de-kube-api-access-7rfmt" (OuterVolumeSpecName: "kube-api-access-7rfmt") pod "e7747f3a-18ba-402a-8d56-d6f7d359f1de" (UID: "e7747f3a-18ba-402a-8d56-d6f7d359f1de"). InnerVolumeSpecName "kube-api-access-7rfmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.185973 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.206729 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.218606 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-698lr\" (UniqueName: \"kubernetes.io/projected/8872fccc-7e47-4ab7-8b31-c81b93fc72de-kube-api-access-698lr\") pod \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.218676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b6306e6-563d-4511-ba58-e685c1c9a599-operator-scripts\") pod \"9b6306e6-563d-4511-ba58-e685c1c9a599\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.218724 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlqzz\" (UniqueName: \"kubernetes.io/projected/9b6306e6-563d-4511-ba58-e685c1c9a599-kube-api-access-dlqzz\") pod \"9b6306e6-563d-4511-ba58-e685c1c9a599\" (UID: \"9b6306e6-563d-4511-ba58-e685c1c9a599\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.218751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8872fccc-7e47-4ab7-8b31-c81b93fc72de-operator-scripts\") pod \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\" (UID: \"8872fccc-7e47-4ab7-8b31-c81b93fc72de\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.219387 4921 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-7rfmt\" (UniqueName: \"kubernetes.io/projected/e7747f3a-18ba-402a-8d56-d6f7d359f1de-kube-api-access-7rfmt\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.220378 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6306e6-563d-4511-ba58-e685c1c9a599-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b6306e6-563d-4511-ba58-e685c1c9a599" (UID: "9b6306e6-563d-4511-ba58-e685c1c9a599"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.221794 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8872fccc-7e47-4ab7-8b31-c81b93fc72de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8872fccc-7e47-4ab7-8b31-c81b93fc72de" (UID: "8872fccc-7e47-4ab7-8b31-c81b93fc72de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.223694 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8872fccc-7e47-4ab7-8b31-c81b93fc72de-kube-api-access-698lr" (OuterVolumeSpecName: "kube-api-access-698lr") pod "8872fccc-7e47-4ab7-8b31-c81b93fc72de" (UID: "8872fccc-7e47-4ab7-8b31-c81b93fc72de"). InnerVolumeSpecName "kube-api-access-698lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.235674 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6306e6-563d-4511-ba58-e685c1c9a599-kube-api-access-dlqzz" (OuterVolumeSpecName: "kube-api-access-dlqzz") pod "9b6306e6-563d-4511-ba58-e685c1c9a599" (UID: "9b6306e6-563d-4511-ba58-e685c1c9a599"). InnerVolumeSpecName "kube-api-access-dlqzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.243970 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.262234 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.267810 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.320467 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq9fr\" (UniqueName: \"kubernetes.io/projected/aaa03737-632a-4574-a31a-93b05e3be3f0-kube-api-access-nq9fr\") pod \"aaa03737-632a-4574-a31a-93b05e3be3f0\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.320562 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0f23d5-c8db-45a5-8c30-c746a0157440-operator-scripts\") pod \"8f0f23d5-c8db-45a5-8c30-c746a0157440\" (UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.320633 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgdpb\" (UniqueName: \"kubernetes.io/projected/8f0f23d5-c8db-45a5-8c30-c746a0157440-kube-api-access-fgdpb\") pod \"8f0f23d5-c8db-45a5-8c30-c746a0157440\" (UID: \"8f0f23d5-c8db-45a5-8c30-c746a0157440\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.320744 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aaa03737-632a-4574-a31a-93b05e3be3f0-operator-scripts\") pod \"aaa03737-632a-4574-a31a-93b05e3be3f0\" (UID: \"aaa03737-632a-4574-a31a-93b05e3be3f0\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.320773 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dzpm\" (UniqueName: \"kubernetes.io/projected/c7a6de4f-07c3-4217-956b-22b24f366220-kube-api-access-9dzpm\") pod \"c7a6de4f-07c3-4217-956b-22b24f366220\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.320799 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a6de4f-07c3-4217-956b-22b24f366220-operator-scripts\") pod \"c7a6de4f-07c3-4217-956b-22b24f366220\" (UID: \"c7a6de4f-07c3-4217-956b-22b24f366220\") " Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321280 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698lr\" (UniqueName: \"kubernetes.io/projected/8872fccc-7e47-4ab7-8b31-c81b93fc72de-kube-api-access-698lr\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321305 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b6306e6-563d-4511-ba58-e685c1c9a599-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321318 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlqzz\" (UniqueName: \"kubernetes.io/projected/9b6306e6-563d-4511-ba58-e685c1c9a599-kube-api-access-dlqzz\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321329 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8872fccc-7e47-4ab7-8b31-c81b93fc72de-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa03737-632a-4574-a31a-93b05e3be3f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aaa03737-632a-4574-a31a-93b05e3be3f0" (UID: "aaa03737-632a-4574-a31a-93b05e3be3f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321775 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7a6de4f-07c3-4217-956b-22b24f366220-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7a6de4f-07c3-4217-956b-22b24f366220" (UID: "c7a6de4f-07c3-4217-956b-22b24f366220"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.321781 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0f23d5-c8db-45a5-8c30-c746a0157440-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f0f23d5-c8db-45a5-8c30-c746a0157440" (UID: "8f0f23d5-c8db-45a5-8c30-c746a0157440"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.324152 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0f23d5-c8db-45a5-8c30-c746a0157440-kube-api-access-fgdpb" (OuterVolumeSpecName: "kube-api-access-fgdpb") pod "8f0f23d5-c8db-45a5-8c30-c746a0157440" (UID: "8f0f23d5-c8db-45a5-8c30-c746a0157440"). InnerVolumeSpecName "kube-api-access-fgdpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.324853 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa03737-632a-4574-a31a-93b05e3be3f0-kube-api-access-nq9fr" (OuterVolumeSpecName: "kube-api-access-nq9fr") pod "aaa03737-632a-4574-a31a-93b05e3be3f0" (UID: "aaa03737-632a-4574-a31a-93b05e3be3f0"). InnerVolumeSpecName "kube-api-access-nq9fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.325288 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a6de4f-07c3-4217-956b-22b24f366220-kube-api-access-9dzpm" (OuterVolumeSpecName: "kube-api-access-9dzpm") pod "c7a6de4f-07c3-4217-956b-22b24f366220" (UID: "c7a6de4f-07c3-4217-956b-22b24f366220"). InnerVolumeSpecName "kube-api-access-9dzpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.422306 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgdpb\" (UniqueName: \"kubernetes.io/projected/8f0f23d5-c8db-45a5-8c30-c746a0157440-kube-api-access-fgdpb\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.422335 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaa03737-632a-4574-a31a-93b05e3be3f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.422345 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dzpm\" (UniqueName: \"kubernetes.io/projected/c7a6de4f-07c3-4217-956b-22b24f366220-kube-api-access-9dzpm\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.422353 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c7a6de4f-07c3-4217-956b-22b24f366220-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.422362 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq9fr\" (UniqueName: \"kubernetes.io/projected/aaa03737-632a-4574-a31a-93b05e3be3f0-kube-api-access-nq9fr\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.422370 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0f23d5-c8db-45a5-8c30-c746a0157440-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.492581 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jdm4q" event={"ID":"8872fccc-7e47-4ab7-8b31-c81b93fc72de","Type":"ContainerDied","Data":"dfaa7bd319fd8f56246e04dd873db0dc06eca6c0f300eb9d8a15b803cc162d8b"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.492621 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfaa7bd319fd8f56246e04dd873db0dc06eca6c0f300eb9d8a15b803cc162d8b" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.492668 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jdm4q" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.507825 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9fc8-account-create-update-k6glt" event={"ID":"e7747f3a-18ba-402a-8d56-d6f7d359f1de","Type":"ContainerDied","Data":"accedaca4908472d661212823b48b3b3d674cea8bdd74b35e1f675ddacdfbe7a"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.507864 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="accedaca4908472d661212823b48b3b3d674cea8bdd74b35e1f675ddacdfbe7a" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.507917 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9fc8-account-create-update-k6glt" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.518922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4f7a-account-create-update-xn88l" event={"ID":"aaa03737-632a-4574-a31a-93b05e3be3f0","Type":"ContainerDied","Data":"9c0b8ba1b2c4ea80dc378f2269853937918b386cdcc3973244ad01cead0cff03"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.518958 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c0b8ba1b2c4ea80dc378f2269853937918b386cdcc3973244ad01cead0cff03" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.519016 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-xn88l" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.526827 4921 generic.go:334] "Generic (PLEG): container finished" podID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerID="a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed" exitCode=0 Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.526881 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef935990-b291-43b7-9d56-673b7b05a7a7","Type":"ContainerDied","Data":"a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.528944 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kjxvt" event={"ID":"c7a6de4f-07c3-4217-956b-22b24f366220","Type":"ContainerDied","Data":"be3072c8e23331eaaa99b805abbc4db10848d0417d3da876ecf5c4b7b7a4f226"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.528978 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be3072c8e23331eaaa99b805abbc4db10848d0417d3da876ecf5c4b7b7a4f226" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.529005 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kjxvt" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.530411 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ppcmx" event={"ID":"9b6306e6-563d-4511-ba58-e685c1c9a599","Type":"ContainerDied","Data":"18d2b0dfd657420a081a90ca342f46495d72561f9e11a16dfc8e0e98361a9c55"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.530440 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d2b0dfd657420a081a90ca342f46495d72561f9e11a16dfc8e0e98361a9c55" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.530512 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ppcmx" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.541751 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-16b4-account-create-update-5nmq2" event={"ID":"8f0f23d5-c8db-45a5-8c30-c746a0157440","Type":"ContainerDied","Data":"7d3c94b856b43de51b210db21126709d39d1fd9d3087febc67db9a33e27585f7"} Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.541800 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3c94b856b43de51b210db21126709d39d1fd9d3087febc67db9a33e27585f7" Mar 18 12:30:58 crc kubenswrapper[4921]: I0318 12:30:58.541877 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-16b4-account-create-update-5nmq2" Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.550816 4921 generic.go:334] "Generic (PLEG): container finished" podID="df692663-cc58-4cf1-a05b-566e0152ee90" containerID="dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d" exitCode=0 Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.550894 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df692663-cc58-4cf1-a05b-566e0152ee90","Type":"ContainerDied","Data":"dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d"} Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.553265 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef935990-b291-43b7-9d56-673b7b05a7a7","Type":"ContainerStarted","Data":"0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222"} Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.553556 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.639832 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.110504347 podStartE2EDuration="52.63981506s" podCreationTimestamp="2026-03-18 12:30:07 +0000 UTC" firstStartedPulling="2026-03-18 12:30:15.396585844 +0000 UTC m=+1234.946506483" lastFinishedPulling="2026-03-18 12:30:24.925896557 +0000 UTC m=+1244.475817196" observedRunningTime="2026-03-18 12:30:59.626885372 +0000 UTC m=+1279.176806011" watchObservedRunningTime="2026-03-18 12:30:59.63981506 +0000 UTC m=+1279.189735689" Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.962038 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cvrpq"] Mar 18 12:30:59 crc kubenswrapper[4921]: I0318 12:30:59.969920 4921 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cvrpq"] Mar 18 12:31:00 crc kubenswrapper[4921]: I0318 12:31:00.125948 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 12:31:00 crc kubenswrapper[4921]: I0318 12:31:00.562916 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df692663-cc58-4cf1-a05b-566e0152ee90","Type":"ContainerStarted","Data":"330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537"} Mar 18 12:31:00 crc kubenswrapper[4921]: I0318 12:31:00.563334 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 12:31:00 crc kubenswrapper[4921]: I0318 12:31:00.610233 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.61021088 podStartE2EDuration="53.61021088s" podCreationTimestamp="2026-03-18 12:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:00.585003633 +0000 UTC m=+1280.134924272" watchObservedRunningTime="2026-03-18 12:31:00.61021088 +0000 UTC m=+1280.160131519" Mar 18 12:31:00 crc kubenswrapper[4921]: I0318 12:31:00.762333 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:31:00 crc kubenswrapper[4921]: E0318 12:31:00.762524 4921 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:31:00 crc kubenswrapper[4921]: E0318 12:31:00.762549 4921 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Mar 18 12:31:00 crc kubenswrapper[4921]: E0318 12:31:00.762607 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift podName:2204df50-7907-4d3b-a8b3-5aee222044f2 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:16.76259117 +0000 UTC m=+1296.312511809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift") pod "swift-storage-0" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2") : configmap "swift-ring-files" not found Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.219386 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2313771d-6034-43c4-8f73-8d67111db536" path="/var/lib/kubelet/pods/2313771d-6034-43c4-8f73-8d67111db536/volumes" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.903114 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fcnk6"] Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.903904 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a6de4f-07c3-4217-956b-22b24f366220" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.903920 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a6de4f-07c3-4217-956b-22b24f366220" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.903943 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0f23d5-c8db-45a5-8c30-c746a0157440" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.903951 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0f23d5-c8db-45a5-8c30-c746a0157440" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.903970 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2313771d-6034-43c4-8f73-8d67111db536" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.903978 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2313771d-6034-43c4-8f73-8d67111db536" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.903994 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7747f3a-18ba-402a-8d56-d6f7d359f1de" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.904002 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7747f3a-18ba-402a-8d56-d6f7d359f1de" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.904021 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8872fccc-7e47-4ab7-8b31-c81b93fc72de" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.904028 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8872fccc-7e47-4ab7-8b31-c81b93fc72de" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.904042 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerName="init" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.904050 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerName="init" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.904064 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6306e6-563d-4511-ba58-e685c1c9a599" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.904073 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6306e6-563d-4511-ba58-e685c1c9a599" containerName="mariadb-database-create" Mar 18 12:31:01 crc 
kubenswrapper[4921]: E0318 12:31:01.904087 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerName="dnsmasq-dns" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.904095 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerName="dnsmasq-dns" Mar 18 12:31:01 crc kubenswrapper[4921]: E0318 12:31:01.904114 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa03737-632a-4574-a31a-93b05e3be3f0" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905133 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa03737-632a-4574-a31a-93b05e3be3f0" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905398 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a6de4f-07c3-4217-956b-22b24f366220" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905417 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6306e6-563d-4511-ba58-e685c1c9a599" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905431 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73a721b-d0ec-48bb-8106-c35c9d1d3466" containerName="dnsmasq-dns" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905448 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0f23d5-c8db-45a5-8c30-c746a0157440" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905459 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7747f3a-18ba-402a-8d56-d6f7d359f1de" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905477 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8872fccc-7e47-4ab7-8b31-c81b93fc72de" containerName="mariadb-database-create" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905489 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa03737-632a-4574-a31a-93b05e3be3f0" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.905502 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2313771d-6034-43c4-8f73-8d67111db536" containerName="mariadb-account-create-update" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.906164 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.911230 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.911230 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p5jht" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.925286 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fcnk6"] Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.981864 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-config-data\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.981919 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjjb\" (UniqueName: \"kubernetes.io/projected/33ebd4aa-2278-4794-a26d-a26333a7fae3-kube-api-access-wjjjb\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:01 crc 
kubenswrapper[4921]: I0318 12:31:01.981963 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-combined-ca-bundle\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:01 crc kubenswrapper[4921]: I0318 12:31:01.981998 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-db-sync-config-data\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.083376 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-db-sync-config-data\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.083499 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-config-data\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.083525 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjjb\" (UniqueName: \"kubernetes.io/projected/33ebd4aa-2278-4794-a26d-a26333a7fae3-kube-api-access-wjjjb\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.083559 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-combined-ca-bundle\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.089727 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-db-sync-config-data\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.089727 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-combined-ca-bundle\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.089849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-config-data\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.108150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjjb\" (UniqueName: \"kubernetes.io/projected/33ebd4aa-2278-4794-a26d-a26333a7fae3-kube-api-access-wjjjb\") pod \"glance-db-sync-fcnk6\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.227414 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.577883 4921 generic.go:334] "Generic (PLEG): container finished" podID="f6989392-6285-4c0d-80e4-3d2d30461e4f" containerID="70abdc91b6f2fb58442ccc59e3cfb2b789054925784b519666261caf23ed88b3" exitCode=0 Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.577969 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j94bd" event={"ID":"f6989392-6285-4c0d-80e4-3d2d30461e4f","Type":"ContainerDied","Data":"70abdc91b6f2fb58442ccc59e3cfb2b789054925784b519666261caf23ed88b3"} Mar 18 12:31:02 crc kubenswrapper[4921]: I0318 12:31:02.786907 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fcnk6"] Mar 18 12:31:02 crc kubenswrapper[4921]: W0318 12:31:02.793508 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ebd4aa_2278_4794_a26d_a26333a7fae3.slice/crio-eb3940341843436ab23153be668475b4b95b02057fdaed6f95b762f450628168 WatchSource:0}: Error finding container eb3940341843436ab23153be668475b4b95b02057fdaed6f95b762f450628168: Status 404 returned error can't find the container with id eb3940341843436ab23153be668475b4b95b02057fdaed6f95b762f450628168 Mar 18 12:31:03 crc kubenswrapper[4921]: I0318 12:31:03.585740 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fcnk6" event={"ID":"33ebd4aa-2278-4794-a26d-a26333a7fae3","Type":"ContainerStarted","Data":"eb3940341843436ab23153be668475b4b95b02057fdaed6f95b762f450628168"} Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.007963 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.116818 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-ring-data-devices\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.117965 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-dispersionconf\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.118094 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6989392-6285-4c0d-80e4-3d2d30461e4f-etc-swift\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.118217 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8xl\" (UniqueName: \"kubernetes.io/projected/f6989392-6285-4c0d-80e4-3d2d30461e4f-kube-api-access-jc8xl\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.118361 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-combined-ca-bundle\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.118492 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-swiftconf\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.118572 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-scripts\") pod \"f6989392-6285-4c0d-80e4-3d2d30461e4f\" (UID: \"f6989392-6285-4c0d-80e4-3d2d30461e4f\") " Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.117903 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.120683 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6989392-6285-4c0d-80e4-3d2d30461e4f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.136069 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6989392-6285-4c0d-80e4-3d2d30461e4f-kube-api-access-jc8xl" (OuterVolumeSpecName: "kube-api-access-jc8xl") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "kube-api-access-jc8xl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.141354 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.143047 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.143687 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.145020 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-scripts" (OuterVolumeSpecName: "scripts") pod "f6989392-6285-4c0d-80e4-3d2d30461e4f" (UID: "f6989392-6285-4c0d-80e4-3d2d30461e4f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220632 4921 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220666 4921 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220676 4921 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6989392-6285-4c0d-80e4-3d2d30461e4f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220685 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8xl\" (UniqueName: \"kubernetes.io/projected/f6989392-6285-4c0d-80e4-3d2d30461e4f-kube-api-access-jc8xl\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220694 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220703 4921 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6989392-6285-4c0d-80e4-3d2d30461e4f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.220712 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6989392-6285-4c0d-80e4-3d2d30461e4f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.600574 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j94bd" event={"ID":"f6989392-6285-4c0d-80e4-3d2d30461e4f","Type":"ContainerDied","Data":"2b5e26a5ee5fb99033014633f25111f23cd46667dda55f0264924d7cdb35f340"} Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.600630 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b5e26a5ee5fb99033014633f25111f23cd46667dda55f0264924d7cdb35f340" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.600662 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j94bd" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.969371 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ftdrz"] Mar 18 12:31:04 crc kubenswrapper[4921]: E0318 12:31:04.969759 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6989392-6285-4c0d-80e4-3d2d30461e4f" containerName="swift-ring-rebalance" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.969779 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6989392-6285-4c0d-80e4-3d2d30461e4f" containerName="swift-ring-rebalance" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.969990 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6989392-6285-4c0d-80e4-3d2d30461e4f" containerName="swift-ring-rebalance" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.972338 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.975885 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 12:31:04 crc kubenswrapper[4921]: I0318 12:31:04.979709 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ftdrz"] Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.034547 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7214a61-816f-44f3-8ce2-3110d5819ad5-operator-scripts\") pod \"root-account-create-update-ftdrz\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.034630 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4psrk\" (UniqueName: \"kubernetes.io/projected/c7214a61-816f-44f3-8ce2-3110d5819ad5-kube-api-access-4psrk\") pod \"root-account-create-update-ftdrz\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.136087 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4psrk\" (UniqueName: \"kubernetes.io/projected/c7214a61-816f-44f3-8ce2-3110d5819ad5-kube-api-access-4psrk\") pod \"root-account-create-update-ftdrz\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.136317 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7214a61-816f-44f3-8ce2-3110d5819ad5-operator-scripts\") pod \"root-account-create-update-ftdrz\" (UID: 
\"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.137817 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7214a61-816f-44f3-8ce2-3110d5819ad5-operator-scripts\") pod \"root-account-create-update-ftdrz\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.157573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4psrk\" (UniqueName: \"kubernetes.io/projected/c7214a61-816f-44f3-8ce2-3110d5819ad5-kube-api-access-4psrk\") pod \"root-account-create-update-ftdrz\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.325716 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:05 crc kubenswrapper[4921]: I0318 12:31:05.816702 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ftdrz"] Mar 18 12:31:06 crc kubenswrapper[4921]: I0318 12:31:06.621357 4921 generic.go:334] "Generic (PLEG): container finished" podID="c7214a61-816f-44f3-8ce2-3110d5819ad5" containerID="c97e73a4d065ed916e858b88a0002d0cab0e04c2e52616c161924f98fb974a12" exitCode=0 Mar 18 12:31:06 crc kubenswrapper[4921]: I0318 12:31:06.621397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ftdrz" event={"ID":"c7214a61-816f-44f3-8ce2-3110d5819ad5","Type":"ContainerDied","Data":"c97e73a4d065ed916e858b88a0002d0cab0e04c2e52616c161924f98fb974a12"} Mar 18 12:31:06 crc kubenswrapper[4921]: I0318 12:31:06.621420 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ftdrz" event={"ID":"c7214a61-816f-44f3-8ce2-3110d5819ad5","Type":"ContainerStarted","Data":"42c190e7acab765b7a4cb8f8b6fcd4d102065c9f5b9bfbb7ecbf0ae292ae544c"} Mar 18 12:31:06 crc kubenswrapper[4921]: I0318 12:31:06.803482 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-djh4f" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" probeResult="failure" output=< Mar 18 12:31:06 crc kubenswrapper[4921]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 12:31:06 crc kubenswrapper[4921]: > Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.059328 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.093917 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7214a61-816f-44f3-8ce2-3110d5819ad5-operator-scripts\") pod \"c7214a61-816f-44f3-8ce2-3110d5819ad5\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.094082 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4psrk\" (UniqueName: \"kubernetes.io/projected/c7214a61-816f-44f3-8ce2-3110d5819ad5-kube-api-access-4psrk\") pod \"c7214a61-816f-44f3-8ce2-3110d5819ad5\" (UID: \"c7214a61-816f-44f3-8ce2-3110d5819ad5\") " Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.095748 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7214a61-816f-44f3-8ce2-3110d5819ad5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7214a61-816f-44f3-8ce2-3110d5819ad5" (UID: "c7214a61-816f-44f3-8ce2-3110d5819ad5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.100465 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7214a61-816f-44f3-8ce2-3110d5819ad5-kube-api-access-4psrk" (OuterVolumeSpecName: "kube-api-access-4psrk") pod "c7214a61-816f-44f3-8ce2-3110d5819ad5" (UID: "c7214a61-816f-44f3-8ce2-3110d5819ad5"). InnerVolumeSpecName "kube-api-access-4psrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.196493 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4psrk\" (UniqueName: \"kubernetes.io/projected/c7214a61-816f-44f3-8ce2-3110d5819ad5-kube-api-access-4psrk\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.196538 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7214a61-816f-44f3-8ce2-3110d5819ad5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.659747 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ftdrz" event={"ID":"c7214a61-816f-44f3-8ce2-3110d5819ad5","Type":"ContainerDied","Data":"42c190e7acab765b7a4cb8f8b6fcd4d102065c9f5b9bfbb7ecbf0ae292ae544c"} Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.660087 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42c190e7acab765b7a4cb8f8b6fcd4d102065c9f5b9bfbb7ecbf0ae292ae544c" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.659794 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ftdrz" Mar 18 12:31:08 crc kubenswrapper[4921]: I0318 12:31:08.804338 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:31:09 crc kubenswrapper[4921]: I0318 12:31:09.170785 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 18 12:31:11 crc kubenswrapper[4921]: I0318 12:31:11.785701 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-djh4f" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" probeResult="failure" output=< Mar 18 12:31:11 crc kubenswrapper[4921]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 12:31:11 crc kubenswrapper[4921]: > Mar 18 12:31:11 crc kubenswrapper[4921]: I0318 12:31:11.812529 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bg8nq" Mar 18 12:31:11 crc kubenswrapper[4921]: I0318 12:31:11.826544 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bg8nq" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.039401 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djh4f-config-4b25x"] Mar 18 12:31:12 crc kubenswrapper[4921]: E0318 12:31:12.039822 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7214a61-816f-44f3-8ce2-3110d5819ad5" containerName="mariadb-account-create-update" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.039844 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7214a61-816f-44f3-8ce2-3110d5819ad5" containerName="mariadb-account-create-update" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 
12:31:12.040054 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7214a61-816f-44f3-8ce2-3110d5819ad5" containerName="mariadb-account-create-update" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.040764 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.042269 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djh4f-config-4b25x"] Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.049561 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.109176 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.109238 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-log-ovn\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.109313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-additional-scripts\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 
12:31:12.109365 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run-ovn\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.109397 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-scripts\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.109633 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwcq\" (UniqueName: \"kubernetes.io/projected/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-kube-api-access-2vwcq\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.211595 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vwcq\" (UniqueName: \"kubernetes.io/projected/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-kube-api-access-2vwcq\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.211682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" 
Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.211729 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-log-ovn\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.211792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-additional-scripts\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.211923 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run-ovn\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.212063 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-scripts\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.212081 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: 
I0318 12:31:12.212328 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run-ovn\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.212721 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-additional-scripts\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.212767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-log-ovn\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.214394 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-scripts\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.248099 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vwcq\" (UniqueName: \"kubernetes.io/projected/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-kube-api-access-2vwcq\") pod \"ovn-controller-djh4f-config-4b25x\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:12 crc kubenswrapper[4921]: I0318 12:31:12.378975 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:16 crc kubenswrapper[4921]: I0318 12:31:16.726838 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fcnk6" event={"ID":"33ebd4aa-2278-4794-a26d-a26333a7fae3","Type":"ContainerStarted","Data":"b97374f116bf95be9e05efb89dd70d18a2c903b071e0c70977ebb9bf22f1006d"} Mar 18 12:31:16 crc kubenswrapper[4921]: I0318 12:31:16.751846 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fcnk6" podStartSLOduration=2.587937179 podStartE2EDuration="15.75182981s" podCreationTimestamp="2026-03-18 12:31:01 +0000 UTC" firstStartedPulling="2026-03-18 12:31:02.796173326 +0000 UTC m=+1282.346093975" lastFinishedPulling="2026-03-18 12:31:15.960065947 +0000 UTC m=+1295.509986606" observedRunningTime="2026-03-18 12:31:16.749244586 +0000 UTC m=+1296.299165245" watchObservedRunningTime="2026-03-18 12:31:16.75182981 +0000 UTC m=+1296.301750449" Mar 18 12:31:16 crc kubenswrapper[4921]: I0318 12:31:16.778652 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djh4f-config-4b25x"] Mar 18 12:31:16 crc kubenswrapper[4921]: I0318 12:31:16.786440 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:31:16 crc kubenswrapper[4921]: I0318 12:31:16.790838 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"swift-storage-0\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " pod="openstack/swift-storage-0" Mar 18 12:31:16 crc kubenswrapper[4921]: I0318 12:31:16.816601 4921 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ovn-controller-djh4f" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" probeResult="failure" output=< Mar 18 12:31:16 crc kubenswrapper[4921]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 12:31:16 crc kubenswrapper[4921]: > Mar 18 12:31:17 crc kubenswrapper[4921]: I0318 12:31:17.014476 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 12:31:17 crc kubenswrapper[4921]: I0318 12:31:17.568380 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:31:17 crc kubenswrapper[4921]: W0318 12:31:17.579307 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2204df50_7907_4d3b_a8b3_5aee222044f2.slice/crio-f1267c5864f022b9dd760cd059eaba11c12e6c2923d4411e4de36265f39ab3d7 WatchSource:0}: Error finding container f1267c5864f022b9dd760cd059eaba11c12e6c2923d4411e4de36265f39ab3d7: Status 404 returned error can't find the container with id f1267c5864f022b9dd760cd059eaba11c12e6c2923d4411e4de36265f39ab3d7 Mar 18 12:31:17 crc kubenswrapper[4921]: I0318 12:31:17.735741 4921 generic.go:334] "Generic (PLEG): container finished" podID="3fd8cfc7-e37a-4115-90a4-d3df984a12b4" containerID="25f7c772beed4682c89d2283bef0c9fe44e3030f69adccee783f77ab5066eb4b" exitCode=0 Mar 18 12:31:17 crc kubenswrapper[4921]: I0318 12:31:17.736720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f-config-4b25x" event={"ID":"3fd8cfc7-e37a-4115-90a4-d3df984a12b4","Type":"ContainerDied","Data":"25f7c772beed4682c89d2283bef0c9fe44e3030f69adccee783f77ab5066eb4b"} Mar 18 12:31:17 crc kubenswrapper[4921]: I0318 12:31:17.736795 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f-config-4b25x" 
event={"ID":"3fd8cfc7-e37a-4115-90a4-d3df984a12b4","Type":"ContainerStarted","Data":"259c74ce4bf82e857fbc430a4a90707d919366b0e739f015c9b2f590884736dd"} Mar 18 12:31:17 crc kubenswrapper[4921]: I0318 12:31:17.738505 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"f1267c5864f022b9dd760cd059eaba11c12e6c2923d4411e4de36265f39ab3d7"} Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.034921 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.128643 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-scripts\") pod \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.128715 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-additional-scripts\") pod \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.128736 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vwcq\" (UniqueName: \"kubernetes.io/projected/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-kube-api-access-2vwcq\") pod \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.128779 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run-ovn\") pod 
\"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.128842 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run\") pod \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.128887 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-log-ovn\") pod \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\" (UID: \"3fd8cfc7-e37a-4115-90a4-d3df984a12b4\") " Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.129270 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3fd8cfc7-e37a-4115-90a4-d3df984a12b4" (UID: "3fd8cfc7-e37a-4115-90a4-d3df984a12b4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.129300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3fd8cfc7-e37a-4115-90a4-d3df984a12b4" (UID: "3fd8cfc7-e37a-4115-90a4-d3df984a12b4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.129314 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run" (OuterVolumeSpecName: "var-run") pod "3fd8cfc7-e37a-4115-90a4-d3df984a12b4" (UID: "3fd8cfc7-e37a-4115-90a4-d3df984a12b4"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.129607 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3fd8cfc7-e37a-4115-90a4-d3df984a12b4" (UID: "3fd8cfc7-e37a-4115-90a4-d3df984a12b4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.129865 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-scripts" (OuterVolumeSpecName: "scripts") pod "3fd8cfc7-e37a-4115-90a4-d3df984a12b4" (UID: "3fd8cfc7-e37a-4115-90a4-d3df984a12b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.133302 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-kube-api-access-2vwcq" (OuterVolumeSpecName: "kube-api-access-2vwcq") pod "3fd8cfc7-e37a-4115-90a4-d3df984a12b4" (UID: "3fd8cfc7-e37a-4115-90a4-d3df984a12b4"). InnerVolumeSpecName "kube-api-access-2vwcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.169091 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.230422 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.230453 4921 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.230464 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vwcq\" (UniqueName: \"kubernetes.io/projected/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-kube-api-access-2vwcq\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.230476 4921 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.230485 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.230493 4921 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fd8cfc7-e37a-4115-90a4-d3df984a12b4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.523894 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-rx99t"] Mar 18 12:31:19 crc 
kubenswrapper[4921]: E0318 12:31:19.524554 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd8cfc7-e37a-4115-90a4-d3df984a12b4" containerName="ovn-config" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.524576 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd8cfc7-e37a-4115-90a4-d3df984a12b4" containerName="ovn-config" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.524761 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd8cfc7-e37a-4115-90a4-d3df984a12b4" containerName="ovn-config" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.525258 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.565184 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rx99t"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.638175 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-operator-scripts\") pod \"cinder-db-create-rx99t\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.638369 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtmm\" (UniqueName: \"kubernetes.io/projected/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-kube-api-access-kmtmm\") pod \"cinder-db-create-rx99t\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.684257 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4d6e-account-create-update-ksj2x"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.685708 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.696660 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.712757 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4d6e-account-create-update-ksj2x"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.740733 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-operator-scripts\") pod \"cinder-db-create-rx99t\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.740815 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml582\" (UniqueName: \"kubernetes.io/projected/865b5e1d-7747-4cc6-b9fb-e65784799085-kube-api-access-ml582\") pod \"cinder-4d6e-account-create-update-ksj2x\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.740875 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/865b5e1d-7747-4cc6-b9fb-e65784799085-operator-scripts\") pod \"cinder-4d6e-account-create-update-ksj2x\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.741056 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtmm\" (UniqueName: \"kubernetes.io/projected/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-kube-api-access-kmtmm\") pod \"cinder-db-create-rx99t\" (UID: 
\"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.741605 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-operator-scripts\") pod \"cinder-db-create-rx99t\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.756976 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f-config-4b25x" event={"ID":"3fd8cfc7-e37a-4115-90a4-d3df984a12b4","Type":"ContainerDied","Data":"259c74ce4bf82e857fbc430a4a90707d919366b0e739f015c9b2f590884736dd"} Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.757026 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-4b25x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.757034 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259c74ce4bf82e857fbc430a4a90707d919366b0e739f015c9b2f590884736dd" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.760640 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"1777c4dc6cc0ebbcf08c1415f64541bce60850c8378a90e1f39c95269a83f819"} Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.760685 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"89a3992a11b9a42578661ade69e99403032115ef433aaf0df1389b585d36e00b"} Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.760700 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"72ba457268b54fa0d33c7866b23bca8be1894d0a484abe9be4ab2fd6c11abae3"} Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.760711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"0b7785aa69c2d4d5a0513e84fe33227f3ad20c98b78d1dcca6b047589db0a914"} Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.785686 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-j69tq"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.786892 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.792197 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtmm\" (UniqueName: \"kubernetes.io/projected/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-kube-api-access-kmtmm\") pod \"cinder-db-create-rx99t\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.839224 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.843358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fksr6\" (UniqueName: \"kubernetes.io/projected/19620ae3-0817-4f27-a363-c57b6b7a0a99-kube-api-access-fksr6\") pod \"barbican-db-create-j69tq\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.843418 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19620ae3-0817-4f27-a363-c57b6b7a0a99-operator-scripts\") pod \"barbican-db-create-j69tq\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.843629 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml582\" (UniqueName: \"kubernetes.io/projected/865b5e1d-7747-4cc6-b9fb-e65784799085-kube-api-access-ml582\") pod \"cinder-4d6e-account-create-update-ksj2x\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.843678 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/865b5e1d-7747-4cc6-b9fb-e65784799085-operator-scripts\") pod \"cinder-4d6e-account-create-update-ksj2x\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.844466 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/865b5e1d-7747-4cc6-b9fb-e65784799085-operator-scripts\") pod 
\"cinder-4d6e-account-create-update-ksj2x\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.860701 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2rjxw"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.862059 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.872097 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j69tq"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.881878 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml582\" (UniqueName: \"kubernetes.io/projected/865b5e1d-7747-4cc6-b9fb-e65784799085-kube-api-access-ml582\") pod \"cinder-4d6e-account-create-update-ksj2x\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.885797 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2rjxw"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.949705 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164592b-7066-42f2-a9f8-f87f2b3eb19e-operator-scripts\") pod \"neutron-db-create-2rjxw\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.949864 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjd7p\" (UniqueName: \"kubernetes.io/projected/2164592b-7066-42f2-a9f8-f87f2b3eb19e-kube-api-access-mjd7p\") pod \"neutron-db-create-2rjxw\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " 
pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.949949 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fksr6\" (UniqueName: \"kubernetes.io/projected/19620ae3-0817-4f27-a363-c57b6b7a0a99-kube-api-access-fksr6\") pod \"barbican-db-create-j69tq\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.949990 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19620ae3-0817-4f27-a363-c57b6b7a0a99-operator-scripts\") pod \"barbican-db-create-j69tq\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.970189 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a2f6-account-create-update-229zv"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.971442 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.975284 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.983268 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a2f6-account-create-update-229zv"] Mar 18 12:31:19 crc kubenswrapper[4921]: I0318 12:31:19.988689 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19620ae3-0817-4f27-a363-c57b6b7a0a99-operator-scripts\") pod \"barbican-db-create-j69tq\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:19.994414 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fksr6\" (UniqueName: \"kubernetes.io/projected/19620ae3-0817-4f27-a363-c57b6b7a0a99-kube-api-access-fksr6\") pod \"barbican-db-create-j69tq\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.003329 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.022172 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-fqckd"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.023399 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.027068 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kqlk9" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.028590 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.029454 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.029606 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.051264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nn52\" (UniqueName: \"kubernetes.io/projected/09aa0ac1-8a57-4e30-b283-8a00711fa9df-kube-api-access-8nn52\") pod \"barbican-a2f6-account-create-update-229zv\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.051376 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164592b-7066-42f2-a9f8-f87f2b3eb19e-operator-scripts\") pod \"neutron-db-create-2rjxw\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.051439 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjd7p\" (UniqueName: \"kubernetes.io/projected/2164592b-7066-42f2-a9f8-f87f2b3eb19e-kube-api-access-mjd7p\") pod \"neutron-db-create-2rjxw\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 
12:31:20.051473 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09aa0ac1-8a57-4e30-b283-8a00711fa9df-operator-scripts\") pod \"barbican-a2f6-account-create-update-229zv\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.052355 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164592b-7066-42f2-a9f8-f87f2b3eb19e-operator-scripts\") pod \"neutron-db-create-2rjxw\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.071863 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-fqckd"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.117682 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjd7p\" (UniqueName: \"kubernetes.io/projected/2164592b-7066-42f2-a9f8-f87f2b3eb19e-kube-api-access-mjd7p\") pod \"neutron-db-create-2rjxw\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.132563 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.154551 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-config-data\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.154697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98thr\" (UniqueName: \"kubernetes.io/projected/dceba139-8e2e-4533-b22c-08d898ffadb5-kube-api-access-98thr\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.154841 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09aa0ac1-8a57-4e30-b283-8a00711fa9df-operator-scripts\") pod \"barbican-a2f6-account-create-update-229zv\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.155010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nn52\" (UniqueName: \"kubernetes.io/projected/09aa0ac1-8a57-4e30-b283-8a00711fa9df-kube-api-access-8nn52\") pod \"barbican-a2f6-account-create-update-229zv\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.155040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-combined-ca-bundle\") pod 
\"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.155801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09aa0ac1-8a57-4e30-b283-8a00711fa9df-operator-scripts\") pod \"barbican-a2f6-account-create-update-229zv\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.179248 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1264-account-create-update-djvfr"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.180474 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.182365 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.188462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nn52\" (UniqueName: \"kubernetes.io/projected/09aa0ac1-8a57-4e30-b283-8a00711fa9df-kube-api-access-8nn52\") pod \"barbican-a2f6-account-create-update-229zv\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.254528 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.266836 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/412d3034-b9e8-403d-a293-6c7ed02a7751-operator-scripts\") pod \"neutron-1264-account-create-update-djvfr\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.267122 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-combined-ca-bundle\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.267192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpcq6\" (UniqueName: \"kubernetes.io/projected/412d3034-b9e8-403d-a293-6c7ed02a7751-kube-api-access-cpcq6\") pod \"neutron-1264-account-create-update-djvfr\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.267399 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-config-data\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.267516 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98thr\" (UniqueName: \"kubernetes.io/projected/dceba139-8e2e-4533-b22c-08d898ffadb5-kube-api-access-98thr\") pod \"keystone-db-sync-fqckd\" 
(UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.272365 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-combined-ca-bundle\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.281971 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-config-data\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.283188 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1264-account-create-update-djvfr"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.294566 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.295594 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-djh4f-config-4b25x"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.302001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98thr\" (UniqueName: \"kubernetes.io/projected/dceba139-8e2e-4533-b22c-08d898ffadb5-kube-api-access-98thr\") pod \"keystone-db-sync-fqckd\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") " pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.305989 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-djh4f-config-4b25x"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.360219 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-djh4f-config-xxpzv"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.362084 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.365536 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.370486 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/412d3034-b9e8-403d-a293-6c7ed02a7751-operator-scripts\") pod \"neutron-1264-account-create-update-djvfr\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.370563 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpcq6\" (UniqueName: \"kubernetes.io/projected/412d3034-b9e8-403d-a293-6c7ed02a7751-kube-api-access-cpcq6\") pod \"neutron-1264-account-create-update-djvfr\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.371710 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/412d3034-b9e8-403d-a293-6c7ed02a7751-operator-scripts\") pod \"neutron-1264-account-create-update-djvfr\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.383483 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djh4f-config-xxpzv"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.387184 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-fqckd" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.417623 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpcq6\" (UniqueName: \"kubernetes.io/projected/412d3034-b9e8-403d-a293-6c7ed02a7751-kube-api-access-cpcq6\") pod \"neutron-1264-account-create-update-djvfr\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.472407 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run-ovn\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.472467 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-log-ovn\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.472495 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vrrd\" (UniqueName: \"kubernetes.io/projected/84ae179f-3d33-4d6d-ab15-19bba068bf5b-kube-api-access-8vrrd\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.472518 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-additional-scripts\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.472579 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.472609 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-scripts\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.510663 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.574772 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run-ovn\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.574824 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-log-ovn\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.574848 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vrrd\" (UniqueName: \"kubernetes.io/projected/84ae179f-3d33-4d6d-ab15-19bba068bf5b-kube-api-access-8vrrd\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.574868 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-additional-scripts\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.574914 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: 
\"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.574937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-scripts\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.577190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run-ovn\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.577282 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-log-ovn\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.578710 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.589437 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-additional-scripts\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " 
pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.590070 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-scripts\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.603047 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vrrd\" (UniqueName: \"kubernetes.io/projected/84ae179f-3d33-4d6d-ab15-19bba068bf5b-kube-api-access-8vrrd\") pod \"ovn-controller-djh4f-config-xxpzv\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.724864 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.841033 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j69tq"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.909244 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rx99t"] Mar 18 12:31:20 crc kubenswrapper[4921]: I0318 12:31:20.953420 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4d6e-account-create-update-ksj2x"] Mar 18 12:31:20 crc kubenswrapper[4921]: W0318 12:31:20.967914 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod865b5e1d_7747_4cc6_b9fb_e65784799085.slice/crio-78929358327a22662ead5da3093025b2540c3796f67aa4fbfc88a9f80a65ea75 WatchSource:0}: Error finding container 78929358327a22662ead5da3093025b2540c3796f67aa4fbfc88a9f80a65ea75: Status 404 returned error can't find the container with id 
78929358327a22662ead5da3093025b2540c3796f67aa4fbfc88a9f80a65ea75 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.114578 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2rjxw"] Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.126386 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-fqckd"] Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.168078 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a2f6-account-create-update-229zv"] Mar 18 12:31:21 crc kubenswrapper[4921]: W0318 12:31:21.189287 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddceba139_8e2e_4533_b22c_08d898ffadb5.slice/crio-addc28e416ab6d6eeb385c17a89e83fcf817cbdaac74844c18d11956511b5fa2 WatchSource:0}: Error finding container addc28e416ab6d6eeb385c17a89e83fcf817cbdaac74844c18d11956511b5fa2: Status 404 returned error can't find the container with id addc28e416ab6d6eeb385c17a89e83fcf817cbdaac74844c18d11956511b5fa2 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.235586 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd8cfc7-e37a-4115-90a4-d3df984a12b4" path="/var/lib/kubelet/pods/3fd8cfc7-e37a-4115-90a4-d3df984a12b4/volumes" Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.374794 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1264-account-create-update-djvfr"] Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.401236 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-djh4f-config-xxpzv"] Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.787461 4921 generic.go:334] "Generic (PLEG): container finished" podID="865b5e1d-7747-4cc6-b9fb-e65784799085" containerID="445eccbb89655c23ff7607f6269036df4f7a8f07bbb8c39a52a981e3a42cf9b8" exitCode=0 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 
12:31:21.787548 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4d6e-account-create-update-ksj2x" event={"ID":"865b5e1d-7747-4cc6-b9fb-e65784799085","Type":"ContainerDied","Data":"445eccbb89655c23ff7607f6269036df4f7a8f07bbb8c39a52a981e3a42cf9b8"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.787614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4d6e-account-create-update-ksj2x" event={"ID":"865b5e1d-7747-4cc6-b9fb-e65784799085","Type":"ContainerStarted","Data":"78929358327a22662ead5da3093025b2540c3796f67aa4fbfc88a9f80a65ea75"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.792998 4921 generic.go:334] "Generic (PLEG): container finished" podID="09aa0ac1-8a57-4e30-b283-8a00711fa9df" containerID="54415727870d7e61613d52c2f8e855664216756534f5319c4207a9e48b8464c6" exitCode=0 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.793070 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a2f6-account-create-update-229zv" event={"ID":"09aa0ac1-8a57-4e30-b283-8a00711fa9df","Type":"ContainerDied","Data":"54415727870d7e61613d52c2f8e855664216756534f5319c4207a9e48b8464c6"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.793119 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a2f6-account-create-update-229zv" event={"ID":"09aa0ac1-8a57-4e30-b283-8a00711fa9df","Type":"ContainerStarted","Data":"b3b98f9e4acbd2a988b420844e62fd81699d0e77add6195559e833d1d5532844"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.794688 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1264-account-create-update-djvfr" event={"ID":"412d3034-b9e8-403d-a293-6c7ed02a7751","Type":"ContainerStarted","Data":"a5708f6e0dbe18a09528014e24acca24470bc042f5404c8cadb04e8c72a62f8f"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.797159 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fqckd" 
event={"ID":"dceba139-8e2e-4533-b22c-08d898ffadb5","Type":"ContainerStarted","Data":"addc28e416ab6d6eeb385c17a89e83fcf817cbdaac74844c18d11956511b5fa2"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.808958 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-djh4f" Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.810605 4921 generic.go:334] "Generic (PLEG): container finished" podID="19620ae3-0817-4f27-a363-c57b6b7a0a99" containerID="9676f7e73a17d328518232be535bd6ebc94a2f33d2ffbee78fe37fbd93328b9f" exitCode=0 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.810673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j69tq" event={"ID":"19620ae3-0817-4f27-a363-c57b6b7a0a99","Type":"ContainerDied","Data":"9676f7e73a17d328518232be535bd6ebc94a2f33d2ffbee78fe37fbd93328b9f"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.810700 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j69tq" event={"ID":"19620ae3-0817-4f27-a363-c57b6b7a0a99","Type":"ContainerStarted","Data":"fa6727d7e38d4f73e0218bc3a9d21a5c122e96bded97b2349003c8b4920d92d3"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.822718 4921 generic.go:334] "Generic (PLEG): container finished" podID="2164592b-7066-42f2-a9f8-f87f2b3eb19e" containerID="bbe5d5b664d93ae46f5d4fb46c965b28251261f3bb035ef27691c9ea1dc9e6e2" exitCode=0 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.822840 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2rjxw" event={"ID":"2164592b-7066-42f2-a9f8-f87f2b3eb19e","Type":"ContainerDied","Data":"bbe5d5b664d93ae46f5d4fb46c965b28251261f3bb035ef27691c9ea1dc9e6e2"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.822869 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2rjxw" 
event={"ID":"2164592b-7066-42f2-a9f8-f87f2b3eb19e","Type":"ContainerStarted","Data":"7707967f9c1f642c98725cbecba9caf3e6503ecb7445058488a89e70437a2799"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.830297 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" containerID="57a6a812f70d6c791d7b79a7c6b59ceeb4376b6b25d705266a3ac24cd61d8dbb" exitCode=0 Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.830388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rx99t" event={"ID":"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9","Type":"ContainerDied","Data":"57a6a812f70d6c791d7b79a7c6b59ceeb4376b6b25d705266a3ac24cd61d8dbb"} Mar 18 12:31:21 crc kubenswrapper[4921]: I0318 12:31:21.830416 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rx99t" event={"ID":"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9","Type":"ContainerStarted","Data":"ffc1fc96a2b0a1a64025c6605a32401b05c3b04ade3d3c07b1b143e91471236e"} Mar 18 12:31:21 crc kubenswrapper[4921]: W0318 12:31:21.961957 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84ae179f_3d33_4d6d_ab15_19bba068bf5b.slice/crio-1f613dca6e23937f7fbf89c17be2a934ef490a26ddcc468bd4e00a65bac9540f WatchSource:0}: Error finding container 1f613dca6e23937f7fbf89c17be2a934ef490a26ddcc468bd4e00a65bac9540f: Status 404 returned error can't find the container with id 1f613dca6e23937f7fbf89c17be2a934ef490a26ddcc468bd4e00a65bac9540f Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.923991 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"1c245944ddb7cd5c122c6cc477fd4b8c17707a0b034cb749ccf88bd64991b476"} Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.924575 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"f44537122f931a3e66acf483b594422f9af64976005b3c0018487d261e996304"} Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.924586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"1f7ae1ab24fc1064033fd0503c3706bdd8dbdc4d41ba5cab405e7ab75a73598f"} Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.926170 4921 generic.go:334] "Generic (PLEG): container finished" podID="84ae179f-3d33-4d6d-ab15-19bba068bf5b" containerID="4e883247d897ebfa3215a4f104abf963ac3aa61acf83b7b4894e7ae109b37711" exitCode=0 Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.926228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f-config-xxpzv" event={"ID":"84ae179f-3d33-4d6d-ab15-19bba068bf5b","Type":"ContainerDied","Data":"4e883247d897ebfa3215a4f104abf963ac3aa61acf83b7b4894e7ae109b37711"} Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.926253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f-config-xxpzv" event={"ID":"84ae179f-3d33-4d6d-ab15-19bba068bf5b","Type":"ContainerStarted","Data":"1f613dca6e23937f7fbf89c17be2a934ef490a26ddcc468bd4e00a65bac9540f"} Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.944374 4921 generic.go:334] "Generic (PLEG): container finished" podID="412d3034-b9e8-403d-a293-6c7ed02a7751" containerID="7842ac3232e3505695ea73d27b40d5b85b9b18985de261336a40c67658b41cec" exitCode=0 Mar 18 12:31:22 crc kubenswrapper[4921]: I0318 12:31:22.944634 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1264-account-create-update-djvfr" event={"ID":"412d3034-b9e8-403d-a293-6c7ed02a7751","Type":"ContainerDied","Data":"7842ac3232e3505695ea73d27b40d5b85b9b18985de261336a40c67658b41cec"} Mar 18 12:31:23 crc kubenswrapper[4921]: 
I0318 12:31:23.441341 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.474061 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.482239 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.488346 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.531435 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.550734 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nn52\" (UniqueName: \"kubernetes.io/projected/09aa0ac1-8a57-4e30-b283-8a00711fa9df-kube-api-access-8nn52\") pod \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552172 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml582\" (UniqueName: \"kubernetes.io/projected/865b5e1d-7747-4cc6-b9fb-e65784799085-kube-api-access-ml582\") pod \"865b5e1d-7747-4cc6-b9fb-e65784799085\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552424 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19620ae3-0817-4f27-a363-c57b6b7a0a99-operator-scripts\") pod \"19620ae3-0817-4f27-a363-c57b6b7a0a99\" (UID: 
\"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552458 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fksr6\" (UniqueName: \"kubernetes.io/projected/19620ae3-0817-4f27-a363-c57b6b7a0a99-kube-api-access-fksr6\") pod \"19620ae3-0817-4f27-a363-c57b6b7a0a99\" (UID: \"19620ae3-0817-4f27-a363-c57b6b7a0a99\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552472 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09aa0ac1-8a57-4e30-b283-8a00711fa9df-operator-scripts\") pod \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\" (UID: \"09aa0ac1-8a57-4e30-b283-8a00711fa9df\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552497 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/865b5e1d-7747-4cc6-b9fb-e65784799085-operator-scripts\") pod \"865b5e1d-7747-4cc6-b9fb-e65784799085\" (UID: \"865b5e1d-7747-4cc6-b9fb-e65784799085\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552640 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmtmm\" (UniqueName: \"kubernetes.io/projected/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-kube-api-access-kmtmm\") pod \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.552677 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-operator-scripts\") pod \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\" (UID: \"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.555337 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/865b5e1d-7747-4cc6-b9fb-e65784799085-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "865b5e1d-7747-4cc6-b9fb-e65784799085" (UID: "865b5e1d-7747-4cc6-b9fb-e65784799085"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.556607 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09aa0ac1-8a57-4e30-b283-8a00711fa9df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09aa0ac1-8a57-4e30-b283-8a00711fa9df" (UID: "09aa0ac1-8a57-4e30-b283-8a00711fa9df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.557120 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" (UID: "c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.557313 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19620ae3-0817-4f27-a363-c57b6b7a0a99-kube-api-access-fksr6" (OuterVolumeSpecName: "kube-api-access-fksr6") pod "19620ae3-0817-4f27-a363-c57b6b7a0a99" (UID: "19620ae3-0817-4f27-a363-c57b6b7a0a99"). InnerVolumeSpecName "kube-api-access-fksr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.557335 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09aa0ac1-8a57-4e30-b283-8a00711fa9df-kube-api-access-8nn52" (OuterVolumeSpecName: "kube-api-access-8nn52") pod "09aa0ac1-8a57-4e30-b283-8a00711fa9df" (UID: "09aa0ac1-8a57-4e30-b283-8a00711fa9df"). InnerVolumeSpecName "kube-api-access-8nn52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.557748 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19620ae3-0817-4f27-a363-c57b6b7a0a99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19620ae3-0817-4f27-a363-c57b6b7a0a99" (UID: "19620ae3-0817-4f27-a363-c57b6b7a0a99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.560189 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-kube-api-access-kmtmm" (OuterVolumeSpecName: "kube-api-access-kmtmm") pod "c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" (UID: "c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9"). InnerVolumeSpecName "kube-api-access-kmtmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.562897 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865b5e1d-7747-4cc6-b9fb-e65784799085-kube-api-access-ml582" (OuterVolumeSpecName: "kube-api-access-ml582") pod "865b5e1d-7747-4cc6-b9fb-e65784799085" (UID: "865b5e1d-7747-4cc6-b9fb-e65784799085"). InnerVolumeSpecName "kube-api-access-ml582". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.656670 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjd7p\" (UniqueName: \"kubernetes.io/projected/2164592b-7066-42f2-a9f8-f87f2b3eb19e-kube-api-access-mjd7p\") pod \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.656750 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164592b-7066-42f2-a9f8-f87f2b3eb19e-operator-scripts\") pod \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\" (UID: \"2164592b-7066-42f2-a9f8-f87f2b3eb19e\") " Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657083 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmtmm\" (UniqueName: \"kubernetes.io/projected/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-kube-api-access-kmtmm\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657095 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657106 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nn52\" (UniqueName: \"kubernetes.io/projected/09aa0ac1-8a57-4e30-b283-8a00711fa9df-kube-api-access-8nn52\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657136 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml582\" (UniqueName: \"kubernetes.io/projected/865b5e1d-7747-4cc6-b9fb-e65784799085-kube-api-access-ml582\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657146 4921 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19620ae3-0817-4f27-a363-c57b6b7a0a99-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657183 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fksr6\" (UniqueName: \"kubernetes.io/projected/19620ae3-0817-4f27-a363-c57b6b7a0a99-kube-api-access-fksr6\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657196 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09aa0ac1-8a57-4e30-b283-8a00711fa9df-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657206 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/865b5e1d-7747-4cc6-b9fb-e65784799085-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.657557 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2164592b-7066-42f2-a9f8-f87f2b3eb19e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2164592b-7066-42f2-a9f8-f87f2b3eb19e" (UID: "2164592b-7066-42f2-a9f8-f87f2b3eb19e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.661549 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2164592b-7066-42f2-a9f8-f87f2b3eb19e-kube-api-access-mjd7p" (OuterVolumeSpecName: "kube-api-access-mjd7p") pod "2164592b-7066-42f2-a9f8-f87f2b3eb19e" (UID: "2164592b-7066-42f2-a9f8-f87f2b3eb19e"). InnerVolumeSpecName "kube-api-access-mjd7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.758889 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjd7p\" (UniqueName: \"kubernetes.io/projected/2164592b-7066-42f2-a9f8-f87f2b3eb19e-kube-api-access-mjd7p\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.758915 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164592b-7066-42f2-a9f8-f87f2b3eb19e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.954772 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j69tq" event={"ID":"19620ae3-0817-4f27-a363-c57b6b7a0a99","Type":"ContainerDied","Data":"fa6727d7e38d4f73e0218bc3a9d21a5c122e96bded97b2349003c8b4920d92d3"} Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.954820 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6727d7e38d4f73e0218bc3a9d21a5c122e96bded97b2349003c8b4920d92d3" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.954880 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j69tq" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.961133 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2rjxw" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.961134 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2rjxw" event={"ID":"2164592b-7066-42f2-a9f8-f87f2b3eb19e","Type":"ContainerDied","Data":"7707967f9c1f642c98725cbecba9caf3e6503ecb7445058488a89e70437a2799"} Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.961257 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7707967f9c1f642c98725cbecba9caf3e6503ecb7445058488a89e70437a2799" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.963055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rx99t" event={"ID":"c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9","Type":"ContainerDied","Data":"ffc1fc96a2b0a1a64025c6605a32401b05c3b04ade3d3c07b1b143e91471236e"} Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.963071 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rx99t" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.963082 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc1fc96a2b0a1a64025c6605a32401b05c3b04ade3d3c07b1b143e91471236e" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.965185 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4d6e-account-create-update-ksj2x" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.965194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4d6e-account-create-update-ksj2x" event={"ID":"865b5e1d-7747-4cc6-b9fb-e65784799085","Type":"ContainerDied","Data":"78929358327a22662ead5da3093025b2540c3796f67aa4fbfc88a9f80a65ea75"} Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.965233 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78929358327a22662ead5da3093025b2540c3796f67aa4fbfc88a9f80a65ea75" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.966526 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a2f6-account-create-update-229zv" event={"ID":"09aa0ac1-8a57-4e30-b283-8a00711fa9df","Type":"ContainerDied","Data":"b3b98f9e4acbd2a988b420844e62fd81699d0e77add6195559e833d1d5532844"} Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.966553 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b98f9e4acbd2a988b420844e62fd81699d0e77add6195559e833d1d5532844" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.966614 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a2f6-account-create-update-229zv" Mar 18 12:31:23 crc kubenswrapper[4921]: I0318 12:31:23.973343 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"6ce294b06257e3ab13597923a81988d1346d5378dc3751bcc0f9a4ac4134d520"} Mar 18 12:31:26 crc kubenswrapper[4921]: I0318 12:31:26.014098 4921 generic.go:334] "Generic (PLEG): container finished" podID="33ebd4aa-2278-4794-a26d-a26333a7fae3" containerID="b97374f116bf95be9e05efb89dd70d18a2c903b071e0c70977ebb9bf22f1006d" exitCode=0 Mar 18 12:31:26 crc kubenswrapper[4921]: I0318 12:31:26.014147 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fcnk6" event={"ID":"33ebd4aa-2278-4794-a26d-a26333a7fae3","Type":"ContainerDied","Data":"b97374f116bf95be9e05efb89dd70d18a2c903b071e0c70977ebb9bf22f1006d"} Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.319145 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.351880 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.370753 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426691 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-scripts\") pod \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426762 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-log-ovn\") pod \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426790 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-db-sync-config-data\") pod \"33ebd4aa-2278-4794-a26d-a26333a7fae3\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426843 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/412d3034-b9e8-403d-a293-6c7ed02a7751-operator-scripts\") pod \"412d3034-b9e8-403d-a293-6c7ed02a7751\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426860 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-config-data\") pod \"33ebd4aa-2278-4794-a26d-a26333a7fae3\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426907 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-additional-scripts\") pod \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426926 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjjjb\" (UniqueName: \"kubernetes.io/projected/33ebd4aa-2278-4794-a26d-a26333a7fae3-kube-api-access-wjjjb\") pod \"33ebd4aa-2278-4794-a26d-a26333a7fae3\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426941 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-combined-ca-bundle\") pod \"33ebd4aa-2278-4794-a26d-a26333a7fae3\" (UID: \"33ebd4aa-2278-4794-a26d-a26333a7fae3\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426965 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpcq6\" (UniqueName: \"kubernetes.io/projected/412d3034-b9e8-403d-a293-6c7ed02a7751-kube-api-access-cpcq6\") pod \"412d3034-b9e8-403d-a293-6c7ed02a7751\" (UID: \"412d3034-b9e8-403d-a293-6c7ed02a7751\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.426988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run\") pod \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.427012 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vrrd\" (UniqueName: \"kubernetes.io/projected/84ae179f-3d33-4d6d-ab15-19bba068bf5b-kube-api-access-8vrrd\") pod \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " Mar 18 12:31:27 
crc kubenswrapper[4921]: I0318 12:31:27.427033 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run-ovn\") pod \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\" (UID: \"84ae179f-3d33-4d6d-ab15-19bba068bf5b\") " Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.427607 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run" (OuterVolumeSpecName: "var-run") pod "84ae179f-3d33-4d6d-ab15-19bba068bf5b" (UID: "84ae179f-3d33-4d6d-ab15-19bba068bf5b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.427798 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.427827 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "84ae179f-3d33-4d6d-ab15-19bba068bf5b" (UID: "84ae179f-3d33-4d6d-ab15-19bba068bf5b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.428166 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-scripts" (OuterVolumeSpecName: "scripts") pod "84ae179f-3d33-4d6d-ab15-19bba068bf5b" (UID: "84ae179f-3d33-4d6d-ab15-19bba068bf5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.428837 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/412d3034-b9e8-403d-a293-6c7ed02a7751-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "412d3034-b9e8-403d-a293-6c7ed02a7751" (UID: "412d3034-b9e8-403d-a293-6c7ed02a7751"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.428864 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "84ae179f-3d33-4d6d-ab15-19bba068bf5b" (UID: "84ae179f-3d33-4d6d-ab15-19bba068bf5b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.429236 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "84ae179f-3d33-4d6d-ab15-19bba068bf5b" (UID: "84ae179f-3d33-4d6d-ab15-19bba068bf5b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.432348 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ebd4aa-2278-4794-a26d-a26333a7fae3-kube-api-access-wjjjb" (OuterVolumeSpecName: "kube-api-access-wjjjb") pod "33ebd4aa-2278-4794-a26d-a26333a7fae3" (UID: "33ebd4aa-2278-4794-a26d-a26333a7fae3"). InnerVolumeSpecName "kube-api-access-wjjjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.433077 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412d3034-b9e8-403d-a293-6c7ed02a7751-kube-api-access-cpcq6" (OuterVolumeSpecName: "kube-api-access-cpcq6") pod "412d3034-b9e8-403d-a293-6c7ed02a7751" (UID: "412d3034-b9e8-403d-a293-6c7ed02a7751"). InnerVolumeSpecName "kube-api-access-cpcq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.433216 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "33ebd4aa-2278-4794-a26d-a26333a7fae3" (UID: "33ebd4aa-2278-4794-a26d-a26333a7fae3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.433975 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ae179f-3d33-4d6d-ab15-19bba068bf5b-kube-api-access-8vrrd" (OuterVolumeSpecName: "kube-api-access-8vrrd") pod "84ae179f-3d33-4d6d-ab15-19bba068bf5b" (UID: "84ae179f-3d33-4d6d-ab15-19bba068bf5b"). InnerVolumeSpecName "kube-api-access-8vrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.452374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ebd4aa-2278-4794-a26d-a26333a7fae3" (UID: "33ebd4aa-2278-4794-a26d-a26333a7fae3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.471772 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-config-data" (OuterVolumeSpecName: "config-data") pod "33ebd4aa-2278-4794-a26d-a26333a7fae3" (UID: "33ebd4aa-2278-4794-a26d-a26333a7fae3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.535965 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/412d3034-b9e8-403d-a293-6c7ed02a7751-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.535999 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536011 4921 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536023 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjjjb\" (UniqueName: \"kubernetes.io/projected/33ebd4aa-2278-4794-a26d-a26333a7fae3-kube-api-access-wjjjb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536035 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536045 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpcq6\" 
(UniqueName: \"kubernetes.io/projected/412d3034-b9e8-403d-a293-6c7ed02a7751-kube-api-access-cpcq6\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536057 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vrrd\" (UniqueName: \"kubernetes.io/projected/84ae179f-3d33-4d6d-ab15-19bba068bf5b-kube-api-access-8vrrd\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536067 4921 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536077 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84ae179f-3d33-4d6d-ab15-19bba068bf5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536086 4921 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84ae179f-3d33-4d6d-ab15-19bba068bf5b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:27 crc kubenswrapper[4921]: I0318 12:31:27.536099 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33ebd4aa-2278-4794-a26d-a26333a7fae3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.035639 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fcnk6" event={"ID":"33ebd4aa-2278-4794-a26d-a26333a7fae3","Type":"ContainerDied","Data":"eb3940341843436ab23153be668475b4b95b02057fdaed6f95b762f450628168"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.036000 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb3940341843436ab23153be668475b4b95b02057fdaed6f95b762f450628168" Mar 18 
12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.035655 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fcnk6" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.041363 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"a68d70fd22c882e995ded0c62216a18073ba6612f41913c598c593d06c61a6b2"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.041492 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"0ab56600506809bec91a2b7ae6b9bf4d001cdb5c75b88b21a9af00d3e3d40e90"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.041553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"fbed1f40b33a5fa1094364d62d483881bd04228924310bd51d1435c0c89e479b"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.041627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"6fa2fcae87945f7dd516860dd504658f8b9dc554af18972aba630feda6408da7"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.043390 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fqckd" event={"ID":"dceba139-8e2e-4533-b22c-08d898ffadb5","Type":"ContainerStarted","Data":"9ecd0296879ad17c999545066ef81fa6224bf76e4e315c77269829322f92fc94"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.045522 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f-config-xxpzv" 
event={"ID":"84ae179f-3d33-4d6d-ab15-19bba068bf5b","Type":"ContainerDied","Data":"1f613dca6e23937f7fbf89c17be2a934ef490a26ddcc468bd4e00a65bac9540f"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.045616 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f613dca6e23937f7fbf89c17be2a934ef490a26ddcc468bd4e00a65bac9540f" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.045708 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f-config-xxpzv" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.051708 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1264-account-create-update-djvfr" event={"ID":"412d3034-b9e8-403d-a293-6c7ed02a7751","Type":"ContainerDied","Data":"a5708f6e0dbe18a09528014e24acca24470bc042f5404c8cadb04e8c72a62f8f"} Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.051755 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5708f6e0dbe18a09528014e24acca24470bc042f5404c8cadb04e8c72a62f8f" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.051856 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1264-account-create-update-djvfr" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.064390 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-fqckd" podStartSLOduration=3.120515443 podStartE2EDuration="9.064374743s" podCreationTimestamp="2026-03-18 12:31:19 +0000 UTC" firstStartedPulling="2026-03-18 12:31:21.205660942 +0000 UTC m=+1300.755581581" lastFinishedPulling="2026-03-18 12:31:27.149520232 +0000 UTC m=+1306.699440881" observedRunningTime="2026-03-18 12:31:28.06321389 +0000 UTC m=+1307.613134549" watchObservedRunningTime="2026-03-18 12:31:28.064374743 +0000 UTC m=+1307.614295382" Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.356077 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412d3034_b9e8_403d_a293_6c7ed02a7751.slice/crio-a5708f6e0dbe18a09528014e24acca24470bc042f5404c8cadb04e8c72a62f8f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412d3034_b9e8_403d_a293_6c7ed02a7751.slice\": RecentStats: unable to find data in memory cache]" Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.453292 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-djh4f-config-xxpzv"] Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.464310 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-djh4f-config-xxpzv"] Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.490318 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-nbsg8"] Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.491376 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aa0ac1-8a57-4e30-b283-8a00711fa9df" containerName="mariadb-account-create-update" Mar 18 
12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.491456 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aa0ac1-8a57-4e30-b283-8a00711fa9df" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.491528 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.491576 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.491629 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2164592b-7066-42f2-a9f8-f87f2b3eb19e" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.491691 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2164592b-7066-42f2-a9f8-f87f2b3eb19e" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.491752 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412d3034-b9e8-403d-a293-6c7ed02a7751" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.491798 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="412d3034-b9e8-403d-a293-6c7ed02a7751" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.491854 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865b5e1d-7747-4cc6-b9fb-e65784799085" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.491900 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="865b5e1d-7747-4cc6-b9fb-e65784799085" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.491951 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ae179f-3d33-4d6d-ab15-19bba068bf5b" containerName="ovn-config"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.492003 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ae179f-3d33-4d6d-ab15-19bba068bf5b" containerName="ovn-config"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.492173 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19620ae3-0817-4f27-a363-c57b6b7a0a99" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.492224 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="19620ae3-0817-4f27-a363-c57b6b7a0a99" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: E0318 12:31:28.492277 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ebd4aa-2278-4794-a26d-a26333a7fae3" containerName="glance-db-sync"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.492331 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ebd4aa-2278-4794-a26d-a26333a7fae3" containerName="glance-db-sync"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493150 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="865b5e1d-7747-4cc6-b9fb-e65784799085" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493307 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2164592b-7066-42f2-a9f8-f87f2b3eb19e" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493386 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="09aa0ac1-8a57-4e30-b283-8a00711fa9df" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493445 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493494 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="412d3034-b9e8-403d-a293-6c7ed02a7751" containerName="mariadb-account-create-update"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493555 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="19620ae3-0817-4f27-a363-c57b6b7a0a99" containerName="mariadb-database-create"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493607 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ebd4aa-2278-4794-a26d-a26333a7fae3" containerName="glance-db-sync"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.493658 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ae179f-3d33-4d6d-ab15-19bba068bf5b" containerName="ovn-config"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.494614 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.501821 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-nbsg8"]
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.556034 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.556152 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-config\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.556183 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.556280 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.556311 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8blnd\" (UniqueName: \"kubernetes.io/projected/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-kube-api-access-8blnd\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.657464 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.657559 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.657579 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8blnd\" (UniqueName: \"kubernetes.io/projected/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-kube-api-access-8blnd\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.657637 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.657664 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-config\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.658419 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.658448 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-config\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.658958 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.659237 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.678299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8blnd\" (UniqueName: \"kubernetes.io/projected/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-kube-api-access-8blnd\") pod \"dnsmasq-dns-5b946c75cc-nbsg8\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") " pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:28 crc kubenswrapper[4921]: I0318 12:31:28.825688 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.071879 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"5417c0aaf863e518433565af42abbad6a0c5b335eef0766c35d94f92e5627f39"}
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.071937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"d92a64abd1a53e46dcafdafcc8d4d1c74904044d5ba50721426f103d435d57d1"}
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.071950 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerStarted","Data":"e1f5e00485fe3c35e3ec69acbb2f60126c30dad072a5acf86f531dc05351e016"}
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.137221 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.475872773 podStartE2EDuration="46.137197464s" podCreationTimestamp="2026-03-18 12:30:43 +0000 UTC" firstStartedPulling="2026-03-18 12:31:17.580821521 +0000 UTC m=+1297.130742160" lastFinishedPulling="2026-03-18 12:31:24.242146212 +0000 UTC m=+1303.792066851" observedRunningTime="2026-03-18 12:31:29.11594566 +0000 UTC m=+1308.665866309" watchObservedRunningTime="2026-03-18 12:31:29.137197464 +0000 UTC m=+1308.687118103"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.218427 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ae179f-3d33-4d6d-ab15-19bba068bf5b" path="/var/lib/kubelet/pods/84ae179f-3d33-4d6d-ab15-19bba068bf5b/volumes"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.330637 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-nbsg8"]
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.507425 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-nbsg8"]
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.540908 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7v9cz"]
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.542408 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.545212 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.560898 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7v9cz"]
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.579070 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.579143 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.579172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-config\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.579210 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7zxk\" (UniqueName: \"kubernetes.io/projected/02493287-86a0-4bc1-9ddc-c808a810ae1c-kube-api-access-m7zxk\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.579248 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.579270 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.680093 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.680189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.680231 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.680253 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-config\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.680296 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7zxk\" (UniqueName: \"kubernetes.io/projected/02493287-86a0-4bc1-9ddc-c808a810ae1c-kube-api-access-m7zxk\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.680331 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.681150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.681614 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.682429 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.682545 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-config\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.682729 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.707349 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7zxk\" (UniqueName: \"kubernetes.io/projected/02493287-86a0-4bc1-9ddc-c808a810ae1c-kube-api-access-m7zxk\") pod \"dnsmasq-dns-74f6bcbc87-7v9cz\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:29 crc kubenswrapper[4921]: I0318 12:31:29.866591 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.088571 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8" event={"ID":"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b","Type":"ContainerDied","Data":"5bf365080213412fffdf4b97198b07765be45489075a816e68b9639c266eb2e7"}
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.088423 4921 generic.go:334] "Generic (PLEG): container finished" podID="bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" containerID="5bf365080213412fffdf4b97198b07765be45489075a816e68b9639c266eb2e7" exitCode=0
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.093162 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8" event={"ID":"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b","Type":"ContainerStarted","Data":"a622e41c4dfa7f56152f1dd36bd6cfd5b4d1380ebca6ae93d93b06073af6ee3f"}
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.340883 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7v9cz"]
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.502212 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.608436 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-dns-svc\") pod \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") "
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.610074 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-config\") pod \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") "
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.610527 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-nb\") pod \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") "
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.610747 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-sb\") pod \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") "
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.610922 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8blnd\" (UniqueName: \"kubernetes.io/projected/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-kube-api-access-8blnd\") pod \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\" (UID: \"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b\") "
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.617187 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-kube-api-access-8blnd" (OuterVolumeSpecName: "kube-api-access-8blnd") pod "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" (UID: "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b"). InnerVolumeSpecName "kube-api-access-8blnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.632099 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" (UID: "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.633995 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" (UID: "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.635675 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-config" (OuterVolumeSpecName: "config") pod "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" (UID: "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.642899 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" (UID: "bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.712835 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.713084 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.713184 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.713285 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8blnd\" (UniqueName: \"kubernetes.io/projected/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-kube-api-access-8blnd\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:30 crc kubenswrapper[4921]: I0318 12:31:30.713360 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:31 crc kubenswrapper[4921]: I0318 12:31:31.105589 4921 generic.go:334] "Generic (PLEG): container finished" podID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerID="824de2ffe0111817984e15e57889da1f4e07328f47213ad070063d9892e50c02" exitCode=0
Mar 18 12:31:31 crc kubenswrapper[4921]: I0318 12:31:31.105657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" event={"ID":"02493287-86a0-4bc1-9ddc-c808a810ae1c","Type":"ContainerDied","Data":"824de2ffe0111817984e15e57889da1f4e07328f47213ad070063d9892e50c02"}
Mar 18 12:31:31 crc kubenswrapper[4921]: I0318 12:31:31.106048 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" event={"ID":"02493287-86a0-4bc1-9ddc-c808a810ae1c","Type":"ContainerStarted","Data":"ee9ba0ce5d2f5a37c500cddd95e0382fe1c369b0ebe04724bb8340d726bfd3c7"}
Mar 18 12:31:31 crc kubenswrapper[4921]: I0318 12:31:31.110349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8" event={"ID":"bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b","Type":"ContainerDied","Data":"a622e41c4dfa7f56152f1dd36bd6cfd5b4d1380ebca6ae93d93b06073af6ee3f"}
Mar 18 12:31:31 crc kubenswrapper[4921]: I0318 12:31:31.110405 4921 scope.go:117] "RemoveContainer" containerID="5bf365080213412fffdf4b97198b07765be45489075a816e68b9639c266eb2e7"
Mar 18 12:31:31 crc kubenswrapper[4921]: I0318 12:31:31.110548 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8"
Mar 18 12:31:32 crc kubenswrapper[4921]: I0318 12:31:32.121826 4921 generic.go:334] "Generic (PLEG): container finished" podID="dceba139-8e2e-4533-b22c-08d898ffadb5" containerID="9ecd0296879ad17c999545066ef81fa6224bf76e4e315c77269829322f92fc94" exitCode=0
Mar 18 12:31:32 crc kubenswrapper[4921]: I0318 12:31:32.121932 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fqckd" event={"ID":"dceba139-8e2e-4533-b22c-08d898ffadb5","Type":"ContainerDied","Data":"9ecd0296879ad17c999545066ef81fa6224bf76e4e315c77269829322f92fc94"}
Mar 18 12:31:32 crc kubenswrapper[4921]: I0318 12:31:32.125799 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" event={"ID":"02493287-86a0-4bc1-9ddc-c808a810ae1c","Type":"ContainerStarted","Data":"77d8a8cbf0bd7c73d8fdec7de16c28655e50165c4f38ff2070ec3b204d6c8049"}
Mar 18 12:31:32 crc kubenswrapper[4921]: I0318 12:31:32.126146 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz"
Mar 18 12:31:32 crc kubenswrapper[4921]: I0318 12:31:32.161159 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" podStartSLOduration=3.161105366 podStartE2EDuration="3.161105366s" podCreationTimestamp="2026-03-18 12:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:32.156136325 +0000 UTC m=+1311.706056974" watchObservedRunningTime="2026-03-18 12:31:32.161105366 +0000 UTC m=+1311.711026005"
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.453614 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fqckd"
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.566541 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-config-data\") pod \"dceba139-8e2e-4533-b22c-08d898ffadb5\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") "
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.566613 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-combined-ca-bundle\") pod \"dceba139-8e2e-4533-b22c-08d898ffadb5\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") "
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.566694 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98thr\" (UniqueName: \"kubernetes.io/projected/dceba139-8e2e-4533-b22c-08d898ffadb5-kube-api-access-98thr\") pod \"dceba139-8e2e-4533-b22c-08d898ffadb5\" (UID: \"dceba139-8e2e-4533-b22c-08d898ffadb5\") "
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.576208 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dceba139-8e2e-4533-b22c-08d898ffadb5-kube-api-access-98thr" (OuterVolumeSpecName: "kube-api-access-98thr") pod "dceba139-8e2e-4533-b22c-08d898ffadb5" (UID: "dceba139-8e2e-4533-b22c-08d898ffadb5"). InnerVolumeSpecName "kube-api-access-98thr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.595717 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dceba139-8e2e-4533-b22c-08d898ffadb5" (UID: "dceba139-8e2e-4533-b22c-08d898ffadb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.628968 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-config-data" (OuterVolumeSpecName: "config-data") pod "dceba139-8e2e-4533-b22c-08d898ffadb5" (UID: "dceba139-8e2e-4533-b22c-08d898ffadb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.676913 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.676994 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98thr\" (UniqueName: \"kubernetes.io/projected/dceba139-8e2e-4533-b22c-08d898ffadb5-kube-api-access-98thr\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:33 crc kubenswrapper[4921]: I0318 12:31:33.677013 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceba139-8e2e-4533-b22c-08d898ffadb5-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.149673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fqckd" event={"ID":"dceba139-8e2e-4533-b22c-08d898ffadb5","Type":"ContainerDied","Data":"addc28e416ab6d6eeb385c17a89e83fcf817cbdaac74844c18d11956511b5fa2"}
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.149956 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="addc28e416ab6d6eeb385c17a89e83fcf817cbdaac74844c18d11956511b5fa2"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.149777 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fqckd"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.412220 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7v9cz"]
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.412427 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerName="dnsmasq-dns" containerID="cri-o://77d8a8cbf0bd7c73d8fdec7de16c28655e50165c4f38ff2070ec3b204d6c8049" gracePeriod=10
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.442067 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s9m8l"]
Mar 18 12:31:34 crc kubenswrapper[4921]: E0318 12:31:34.442552 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dceba139-8e2e-4533-b22c-08d898ffadb5" containerName="keystone-db-sync"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.442569 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceba139-8e2e-4533-b22c-08d898ffadb5" containerName="keystone-db-sync"
Mar 18 12:31:34 crc kubenswrapper[4921]: E0318 12:31:34.442599 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" containerName="init"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.442607 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" containerName="init"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.442774 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" containerName="init"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.442804 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceba139-8e2e-4533-b22c-08d898ffadb5" containerName="keystone-db-sync"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.443544 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s9m8l"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.449666 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kqlk9"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.452730 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-5bplm"]
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.453518 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.453782 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.453903 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.454532 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-5bplm"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.455993 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.487480 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s9m8l"]
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.489403 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-config-data\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.489454 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qzh\" (UniqueName: \"kubernetes.io/projected/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-kube-api-access-f5qzh\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.489524 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-fernet-keys\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l"
Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.489601 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-combined-ca-bundle\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l"
Mar 18
crc kubenswrapper[4921]: I0318 12:31:34.489643 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-scripts\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.489680 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-credential-keys\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.520092 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-5bplm"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.592375 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdn5j\" (UniqueName: \"kubernetes.io/projected/88102bf2-01c2-4cfe-a798-4deac7803ec0-kube-api-access-wdn5j\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.592769 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.592833 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-config\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.592883 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-combined-ca-bundle\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.592928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-scripts\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.592970 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.593004 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-credential-keys\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.593060 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.593125 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-config-data\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.593164 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qzh\" (UniqueName: \"kubernetes.io/projected/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-kube-api-access-f5qzh\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.593211 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.593237 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-fernet-keys\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.612656 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-combined-ca-bundle\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.613842 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-fernet-keys\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.614156 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-credential-keys\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.617786 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-config-data\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.619683 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-scripts\") pod \"keystone-bootstrap-s9m8l\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.655826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qzh\" (UniqueName: \"kubernetes.io/projected/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-kube-api-access-f5qzh\") pod \"keystone-bootstrap-s9m8l\" (UID: 
\"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.697156 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.698991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-config\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.699085 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.699144 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.699211 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.699251 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdn5j\" (UniqueName: 
\"kubernetes.io/projected/88102bf2-01c2-4cfe-a798-4deac7803ec0-kube-api-access-wdn5j\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.699283 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.700304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.700931 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-config\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.704695 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.704983 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-svc\") pod 
\"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.705501 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.707131 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.709024 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.719468 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.743415 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.753008 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdn5j\" (UniqueName: \"kubernetes.io/projected/88102bf2-01c2-4cfe-a798-4deac7803ec0-kube-api-access-wdn5j\") pod \"dnsmasq-dns-847c4cc679-5bplm\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.766199 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nszlz"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.767532 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.778375 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.778424 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v6kc7" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.778676 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.785549 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.803981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-config-data\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804082 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-scripts\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804105 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804159 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-run-httpd\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804183 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-log-httpd\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804220 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwcqq\" (UniqueName: \"kubernetes.io/projected/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-kube-api-access-bwcqq\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.804673 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.818021 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nszlz"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.863843 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7wrc2"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.864838 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.880663 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.880961 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6xm7m" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.884057 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.886674 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-5bplm"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913185 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7wrc2"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913307 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-db-sync-config-data\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-combined-ca-bundle\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913382 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2lpz\" (UniqueName: \"kubernetes.io/projected/0022dc9f-31d2-440f-831a-ae0a03c22b63-kube-api-access-j2lpz\") pod 
\"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913399 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-combined-ca-bundle\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913422 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-scripts\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913467 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0022dc9f-31d2-440f-831a-ae0a03c22b63-etc-machine-id\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913505 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-run-httpd\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 
12:31:34.913532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnhkg\" (UniqueName: \"kubernetes.io/projected/c00f5d00-73a8-4268-acc9-49f809cf6d7f-kube-api-access-pnhkg\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913570 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-log-httpd\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913585 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwcqq\" (UniqueName: \"kubernetes.io/projected/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-kube-api-access-bwcqq\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913615 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-config-data\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913629 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-scripts\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-config-data\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.913682 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-config\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.923553 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-scripts\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.924992 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-log-httpd\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.925375 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-run-httpd\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: 
I0318 12:31:34.929154 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.931784 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-config-data\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.937208 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.938669 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ghx6h"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.940354 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.951673 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.951912 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nctw2" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.952007 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ghx6h"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.967268 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-p4hhn"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.969315 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.984883 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-x44rz"] Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.986735 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.988100 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mjwkw" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.988249 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 12:31:34 crc kubenswrapper[4921]: I0318 12:31:34.988399 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.006657 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwcqq\" (UniqueName: \"kubernetes.io/projected/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-kube-api-access-bwcqq\") pod \"ceilometer-0\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") " pod="openstack/ceilometer-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.029569 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-x44rz"] Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.029614 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-p4hhn"] Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.031962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-scripts\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-config-data\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 
crc kubenswrapper[4921]: I0318 12:31:35.032043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-scripts\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032062 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-logs\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032083 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-config\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032125 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcvwl\" (UniqueName: \"kubernetes.io/projected/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-kube-api-access-lcvwl\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032149 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-db-sync-config-data\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032173 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-combined-ca-bundle\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032210 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-combined-ca-bundle\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032244 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2lpz\" (UniqueName: \"kubernetes.io/projected/0022dc9f-31d2-440f-831a-ae0a03c22b63-kube-api-access-j2lpz\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-config-data\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-combined-ca-bundle\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032305 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/b7908371-b4b8-4437-be4a-13b8fccb6a9f-kube-api-access-gt6qk\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032343 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0022dc9f-31d2-440f-831a-ae0a03c22b63-etc-machine-id\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032384 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-combined-ca-bundle\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnhkg\" (UniqueName: \"kubernetes.io/projected/c00f5d00-73a8-4268-acc9-49f809cf6d7f-kube-api-access-pnhkg\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.032460 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-db-sync-config-data\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.034386 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/0022dc9f-31d2-440f-831a-ae0a03c22b63-etc-machine-id\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.037990 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-config-data\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.043329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-config\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.044651 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-scripts\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.045167 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-combined-ca-bundle\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.048243 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-db-sync-config-data\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 
12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.051767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-combined-ca-bundle\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.122064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2lpz\" (UniqueName: \"kubernetes.io/projected/0022dc9f-31d2-440f-831a-ae0a03c22b63-kube-api-access-j2lpz\") pod \"cinder-db-sync-nszlz\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") " pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.129464 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.131978 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nszlz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155213 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-config\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155345 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-combined-ca-bundle\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155446 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155514 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-db-sync-config-data\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155632 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv77c\" (UniqueName: \"kubernetes.io/projected/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-kube-api-access-zv77c\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: 
\"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155681 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155727 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-scripts\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155774 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-logs\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155818 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcvwl\" (UniqueName: \"kubernetes.io/projected/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-kube-api-access-lcvwl\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155853 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-combined-ca-bundle\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 
18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155908 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155961 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.155994 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-config-data\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.156042 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/b7908371-b4b8-4437-be4a-13b8fccb6a9f-kube-api-access-gt6qk\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.167753 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnhkg\" (UniqueName: \"kubernetes.io/projected/c00f5d00-73a8-4268-acc9-49f809cf6d7f-kube-api-access-pnhkg\") pod \"neutron-db-sync-7wrc2\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 
12:31:35.238266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-scripts\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.253793 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-combined-ca-bundle\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.183782 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-logs\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.269525 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.284730 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-db-sync-config-data\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.287760 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-combined-ca-bundle\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.290245 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.290327 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.290356 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc 
kubenswrapper[4921]: I0318 12:31:35.290405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-config\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.290469 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.290518 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv77c\" (UniqueName: \"kubernetes.io/projected/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-kube-api-access-zv77c\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.298567 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcvwl\" (UniqueName: \"kubernetes.io/projected/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-kube-api-access-lcvwl\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.299951 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/b7908371-b4b8-4437-be4a-13b8fccb6a9f-kube-api-access-gt6qk\") pod \"barbican-db-sync-ghx6h\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.302156 4921 
generic.go:334] "Generic (PLEG): container finished" podID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerID="77d8a8cbf0bd7c73d8fdec7de16c28655e50165c4f38ff2070ec3b204d6c8049" exitCode=0 Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.302194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" event={"ID":"02493287-86a0-4bc1-9ddc-c808a810ae1c","Type":"ContainerDied","Data":"77d8a8cbf0bd7c73d8fdec7de16c28655e50165c4f38ff2070ec3b204d6c8049"} Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.304744 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-config-data\") pod \"placement-db-sync-p4hhn\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.308080 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.316724 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.317675 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 
12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.317844 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-config\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.322890 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv77c\" (UniqueName: \"kubernetes.io/projected/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-kube-api-access-zv77c\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.323030 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-x44rz\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.407692 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-p4hhn" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.541149 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.585499 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.589836 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.605791 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:35 crc kubenswrapper[4921]: E0318 12:31:35.606222 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerName="init" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.606243 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerName="init" Mar 18 12:31:35 crc kubenswrapper[4921]: E0318 12:31:35.606367 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerName="dnsmasq-dns" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.606379 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerName="dnsmasq-dns" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.606604 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" containerName="dnsmasq-dns" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.607688 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.611222 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-svc\") pod \"02493287-86a0-4bc1-9ddc-c808a810ae1c\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.611355 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7zxk\" (UniqueName: \"kubernetes.io/projected/02493287-86a0-4bc1-9ddc-c808a810ae1c-kube-api-access-m7zxk\") pod \"02493287-86a0-4bc1-9ddc-c808a810ae1c\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.611413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-swift-storage-0\") pod \"02493287-86a0-4bc1-9ddc-c808a810ae1c\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.611428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-nb\") pod \"02493287-86a0-4bc1-9ddc-c808a810ae1c\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.611516 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-sb\") pod \"02493287-86a0-4bc1-9ddc-c808a810ae1c\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.611536 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-config\") pod \"02493287-86a0-4bc1-9ddc-c808a810ae1c\" (UID: \"02493287-86a0-4bc1-9ddc-c808a810ae1c\") " Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.622743 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p5jht" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.622807 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.622996 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.623181 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.636564 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02493287-86a0-4bc1-9ddc-c808a810ae1c-kube-api-access-m7zxk" (OuterVolumeSpecName: "kube-api-access-m7zxk") pod "02493287-86a0-4bc1-9ddc-c808a810ae1c" (UID: "02493287-86a0-4bc1-9ddc-c808a810ae1c"). InnerVolumeSpecName "kube-api-access-m7zxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.654779 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.677702 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s9m8l"] Mar 18 12:31:35 crc kubenswrapper[4921]: W0318 12:31:35.689559 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded1c9bae_63a4_4e5d_8c81_3113898f71c7.slice/crio-e6312b8072ca2869aa57e499e58ce87d3ad88e6d4e3181d417eb5b83ebf2b9b7 WatchSource:0}: Error finding container e6312b8072ca2869aa57e499e58ce87d3ad88e6d4e3181d417eb5b83ebf2b9b7: Status 404 returned error can't find the container with id e6312b8072ca2869aa57e499e58ce87d3ad88e6d4e3181d417eb5b83ebf2b9b7 Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.694259 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.695741 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.699087 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.699618 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.713551 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.713857 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.713986 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgzfr\" (UniqueName: \"kubernetes.io/projected/c7ecbdd9-b034-4731-b13b-65deed2224aa-kube-api-access-zgzfr\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.714104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.714244 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.714491 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.714600 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-logs\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.715099 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.715456 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7zxk\" (UniqueName: \"kubernetes.io/projected/02493287-86a0-4bc1-9ddc-c808a810ae1c-kube-api-access-m7zxk\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:35 crc 
kubenswrapper[4921]: I0318 12:31:35.726949 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02493287-86a0-4bc1-9ddc-c808a810ae1c" (UID: "02493287-86a0-4bc1-9ddc-c808a810ae1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.750048 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02493287-86a0-4bc1-9ddc-c808a810ae1c" (UID: "02493287-86a0-4bc1-9ddc-c808a810ae1c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.757736 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.759594 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "02493287-86a0-4bc1-9ddc-c808a810ae1c" (UID: "02493287-86a0-4bc1-9ddc-c808a810ae1c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.774735 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-config" (OuterVolumeSpecName: "config") pod "02493287-86a0-4bc1-9ddc-c808a810ae1c" (UID: "02493287-86a0-4bc1-9ddc-c808a810ae1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:35 crc kubenswrapper[4921]: W0318 12:31:35.794513 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88102bf2_01c2_4cfe_a798_4deac7803ec0.slice/crio-671ebcfb87e292cc1a49696f06d770712d592dccf1fb4e1520b0adee2f896c52 WatchSource:0}: Error finding container 671ebcfb87e292cc1a49696f06d770712d592dccf1fb4e1520b0adee2f896c52: Status 404 returned error can't find the container with id 671ebcfb87e292cc1a49696f06d770712d592dccf1fb4e1520b0adee2f896c52 Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.803235 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02493287-86a0-4bc1-9ddc-c808a810ae1c" (UID: "02493287-86a0-4bc1-9ddc-c808a810ae1c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.814082 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-5bplm"] Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816399 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc 
kubenswrapper[4921]: I0318 12:31:35.816420 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgzfr\" (UniqueName: \"kubernetes.io/projected/c7ecbdd9-b034-4731-b13b-65deed2224aa-kube-api-access-zgzfr\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816444 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816477 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816520 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 
12:31:35.816543 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8v8\" (UniqueName: \"kubernetes.io/projected/b2dc6b10-845f-455a-828b-36e6eafc21f4-kube-api-access-wd8v8\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816561 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816588 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-logs\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816606 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816662 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816692 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816745 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816782 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816792 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816801 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816810 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.816818 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02493287-86a0-4bc1-9ddc-c808a810ae1c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.818554 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.820322 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.823069 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-logs\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " 
pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.823677 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.845957 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgzfr\" (UniqueName: \"kubernetes.io/projected/c7ecbdd9-b034-4731-b13b-65deed2224aa-kube-api-access-zgzfr\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.877221 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.878477 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.879428 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: 
I0318 12:31:35.920243 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8v8\" (UniqueName: \"kubernetes.io/projected/b2dc6b10-845f-455a-828b-36e6eafc21f4-kube-api-access-wd8v8\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.920381 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.920531 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.920650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.920775 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.920887 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.920969 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.921018 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.922064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.922266 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.930499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.934954 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.936301 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.937142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.939820 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.957421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.966905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8v8\" (UniqueName: \"kubernetes.io/projected/b2dc6b10-845f-455a-828b-36e6eafc21f4-kube-api-access-wd8v8\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.988491 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:31:35 crc kubenswrapper[4921]: I0318 12:31:35.994477 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nszlz"] Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.135208 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-p4hhn"] Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.141450 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:31:36 crc kubenswrapper[4921]: W0318 12:31:36.145330 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cce4ccd_6a44_4c00_b7f5_9c74946eb308.slice/crio-23a2fac68bcb55d5c66fa45f1af5767b5e3202edfddef79c117ac93631a60edd WatchSource:0}: Error finding container 23a2fac68bcb55d5c66fa45f1af5767b5e3202edfddef79c117ac93631a60edd: Status 404 returned error can't find the container with id 23a2fac68bcb55d5c66fa45f1af5767b5e3202edfddef79c117ac93631a60edd Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.160414 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7wrc2"] Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.174966 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:31:36 crc kubenswrapper[4921]: W0318 12:31:36.175819 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc00f5d00_73a8_4268_acc9_49f809cf6d7f.slice/crio-c0b666e908f82c701c828154dfe5b274321888a58cc70a1a2fb13154f7450ab9 WatchSource:0}: Error finding container c0b666e908f82c701c828154dfe5b274321888a58cc70a1a2fb13154f7450ab9: Status 404 returned error can't find the container with id c0b666e908f82c701c828154dfe5b274321888a58cc70a1a2fb13154f7450ab9 Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.184379 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.320303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p4hhn" event={"ID":"5cce4ccd-6a44-4c00-b7f5-9c74946eb308","Type":"ContainerStarted","Data":"23a2fac68bcb55d5c66fa45f1af5767b5e3202edfddef79c117ac93631a60edd"} Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.331371 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.331361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-7v9cz" event={"ID":"02493287-86a0-4bc1-9ddc-c808a810ae1c","Type":"ContainerDied","Data":"ee9ba0ce5d2f5a37c500cddd95e0382fe1c369b0ebe04724bb8340d726bfd3c7"} Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.331555 4921 scope.go:117] "RemoveContainer" containerID="77d8a8cbf0bd7c73d8fdec7de16c28655e50165c4f38ff2070ec3b204d6c8049" Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.336842 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7wrc2" event={"ID":"c00f5d00-73a8-4268-acc9-49f809cf6d7f","Type":"ContainerStarted","Data":"c0b666e908f82c701c828154dfe5b274321888a58cc70a1a2fb13154f7450ab9"} Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.342190 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nszlz" event={"ID":"0022dc9f-31d2-440f-831a-ae0a03c22b63","Type":"ContainerStarted","Data":"d0e540fa388afd5c99aafa99dcb5c81f899ee7a7dddc7eebc405ae426ccd917d"} Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.346694 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerStarted","Data":"77e3c960b24d096d19356c1028545fc660992459cebfcd16320749d615d8a19f"} Mar 18 12:31:36 crc 
kubenswrapper[4921]: I0318 12:31:36.348664 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-5bplm" event={"ID":"88102bf2-01c2-4cfe-a798-4deac7803ec0","Type":"ContainerStarted","Data":"671ebcfb87e292cc1a49696f06d770712d592dccf1fb4e1520b0adee2f896c52"} Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.350013 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9m8l" event={"ID":"ed1c9bae-63a4-4e5d-8c81-3113898f71c7","Type":"ContainerStarted","Data":"e6312b8072ca2869aa57e499e58ce87d3ad88e6d4e3181d417eb5b83ebf2b9b7"} Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.379665 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-x44rz"] Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.387025 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ghx6h"] Mar 18 12:31:36 crc kubenswrapper[4921]: W0318 12:31:36.389626 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7908371_b4b8_4437_be4a_13b8fccb6a9f.slice/crio-4c6e16e7573daea38fba8c9b521c63b389bec9e5a2a7652504ef673f321c378b WatchSource:0}: Error finding container 4c6e16e7573daea38fba8c9b521c63b389bec9e5a2a7652504ef673f321c378b: Status 404 returned error can't find the container with id 4c6e16e7573daea38fba8c9b521c63b389bec9e5a2a7652504ef673f321c378b Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.405980 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7v9cz"] Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.413696 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-7v9cz"] Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.420976 4921 scope.go:117] "RemoveContainer" containerID="824de2ffe0111817984e15e57889da1f4e07328f47213ad070063d9892e50c02" Mar 18 12:31:36 crc 
kubenswrapper[4921]: W0318 12:31:36.422667 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc6a9d7f_27e2_4c00_8077_6e1044ad20af.slice/crio-fb962b1a42b76bc3e87afe578dcaaee6c19cd8250ca1338eebd0cec9ee4c259d WatchSource:0}: Error finding container fb962b1a42b76bc3e87afe578dcaaee6c19cd8250ca1338eebd0cec9ee4c259d: Status 404 returned error can't find the container with id fb962b1a42b76bc3e87afe578dcaaee6c19cd8250ca1338eebd0cec9ee4c259d Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.829196 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:36 crc kubenswrapper[4921]: W0318 12:31:36.881118 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ecbdd9_b034_4731_b13b_65deed2224aa.slice/crio-95cb2a8c8ee0f43fb859f8ca23a46d176c88838af73f340e008e85642ce66cd7 WatchSource:0}: Error finding container 95cb2a8c8ee0f43fb859f8ca23a46d176c88838af73f340e008e85642ce66cd7: Status 404 returned error can't find the container with id 95cb2a8c8ee0f43fb859f8ca23a46d176c88838af73f340e008e85642ce66cd7 Mar 18 12:31:36 crc kubenswrapper[4921]: I0318 12:31:36.948711 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.024072 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.099411 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.115192 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.230348 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="02493287-86a0-4bc1-9ddc-c808a810ae1c" path="/var/lib/kubelet/pods/02493287-86a0-4bc1-9ddc-c808a810ae1c/volumes" Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.402370 4921 generic.go:334] "Generic (PLEG): container finished" podID="88102bf2-01c2-4cfe-a798-4deac7803ec0" containerID="e0968ce981fe39c85e475e4db3a09fc96f64003564249eb6c45312f4a9c19dbc" exitCode=0 Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.402472 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-5bplm" event={"ID":"88102bf2-01c2-4cfe-a798-4deac7803ec0","Type":"ContainerDied","Data":"e0968ce981fe39c85e475e4db3a09fc96f64003564249eb6c45312f4a9c19dbc"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.414888 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9m8l" event={"ID":"ed1c9bae-63a4-4e5d-8c81-3113898f71c7","Type":"ContainerStarted","Data":"3304fa91e8bbabf4082cc5197900d74b2b6e74d5eb12fcdb90a6a309eb2fc462"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.425207 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b2dc6b10-845f-455a-828b-36e6eafc21f4","Type":"ContainerStarted","Data":"489a53e268288468e9d3596aca8d5681724ac913c2497088eeb8c1495e699e9f"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.441747 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7ecbdd9-b034-4731-b13b-65deed2224aa","Type":"ContainerStarted","Data":"95cb2a8c8ee0f43fb859f8ca23a46d176c88838af73f340e008e85642ce66cd7"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.451750 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s9m8l" podStartSLOduration=3.45171946 podStartE2EDuration="3.45171946s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:37.448935691 +0000 UTC m=+1316.998856340" watchObservedRunningTime="2026-03-18 12:31:37.45171946 +0000 UTC m=+1317.001640099" Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.465219 4921 generic.go:334] "Generic (PLEG): container finished" podID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerID="a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e" exitCode=0 Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.465328 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" event={"ID":"cc6a9d7f-27e2-4c00-8077-6e1044ad20af","Type":"ContainerDied","Data":"a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.465355 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" event={"ID":"cc6a9d7f-27e2-4c00-8077-6e1044ad20af","Type":"ContainerStarted","Data":"fb962b1a42b76bc3e87afe578dcaaee6c19cd8250ca1338eebd0cec9ee4c259d"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.476481 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7wrc2" event={"ID":"c00f5d00-73a8-4268-acc9-49f809cf6d7f","Type":"ContainerStarted","Data":"24e1e708b5358a5afd58f060787e1ff245958125b4d36a41a5a823dfeefd9233"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.478839 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghx6h" event={"ID":"b7908371-b4b8-4437-be4a-13b8fccb6a9f","Type":"ContainerStarted","Data":"4c6e16e7573daea38fba8c9b521c63b389bec9e5a2a7652504ef673f321c378b"} Mar 18 12:31:37 crc kubenswrapper[4921]: I0318 12:31:37.532540 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7wrc2" podStartSLOduration=3.532521706 podStartE2EDuration="3.532521706s" podCreationTimestamp="2026-03-18 12:31:34 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:37.531333222 +0000 UTC m=+1317.081253861" watchObservedRunningTime="2026-03-18 12:31:37.532521706 +0000 UTC m=+1317.082442345" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.147967 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.294259 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-sb\") pod \"88102bf2-01c2-4cfe-a798-4deac7803ec0\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.294986 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdn5j\" (UniqueName: \"kubernetes.io/projected/88102bf2-01c2-4cfe-a798-4deac7803ec0-kube-api-access-wdn5j\") pod \"88102bf2-01c2-4cfe-a798-4deac7803ec0\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.295153 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-config\") pod \"88102bf2-01c2-4cfe-a798-4deac7803ec0\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.295197 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-swift-storage-0\") pod \"88102bf2-01c2-4cfe-a798-4deac7803ec0\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.295321 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-svc\") pod \"88102bf2-01c2-4cfe-a798-4deac7803ec0\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.295351 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-nb\") pod \"88102bf2-01c2-4cfe-a798-4deac7803ec0\" (UID: \"88102bf2-01c2-4cfe-a798-4deac7803ec0\") " Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.303036 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88102bf2-01c2-4cfe-a798-4deac7803ec0-kube-api-access-wdn5j" (OuterVolumeSpecName: "kube-api-access-wdn5j") pod "88102bf2-01c2-4cfe-a798-4deac7803ec0" (UID: "88102bf2-01c2-4cfe-a798-4deac7803ec0"). InnerVolumeSpecName "kube-api-access-wdn5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.342161 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88102bf2-01c2-4cfe-a798-4deac7803ec0" (UID: "88102bf2-01c2-4cfe-a798-4deac7803ec0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.342766 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "88102bf2-01c2-4cfe-a798-4deac7803ec0" (UID: "88102bf2-01c2-4cfe-a798-4deac7803ec0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.351808 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-config" (OuterVolumeSpecName: "config") pod "88102bf2-01c2-4cfe-a798-4deac7803ec0" (UID: "88102bf2-01c2-4cfe-a798-4deac7803ec0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.361355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "88102bf2-01c2-4cfe-a798-4deac7803ec0" (UID: "88102bf2-01c2-4cfe-a798-4deac7803ec0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.366327 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "88102bf2-01c2-4cfe-a798-4deac7803ec0" (UID: "88102bf2-01c2-4cfe-a798-4deac7803ec0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.398219 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdn5j\" (UniqueName: \"kubernetes.io/projected/88102bf2-01c2-4cfe-a798-4deac7803ec0-kube-api-access-wdn5j\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.398256 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.398266 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.398275 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.398283 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.398293 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88102bf2-01c2-4cfe-a798-4deac7803ec0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.503861 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" event={"ID":"cc6a9d7f-27e2-4c00-8077-6e1044ad20af","Type":"ContainerStarted","Data":"aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6"} Mar 18 12:31:38 crc 
kubenswrapper[4921]: I0318 12:31:38.504245 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.508352 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-5bplm" event={"ID":"88102bf2-01c2-4cfe-a798-4deac7803ec0","Type":"ContainerDied","Data":"671ebcfb87e292cc1a49696f06d770712d592dccf1fb4e1520b0adee2f896c52"} Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.508395 4921 scope.go:117] "RemoveContainer" containerID="e0968ce981fe39c85e475e4db3a09fc96f64003564249eb6c45312f4a9c19dbc" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.508530 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-5bplm" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.537726 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b2dc6b10-845f-455a-828b-36e6eafc21f4","Type":"ContainerStarted","Data":"8cb60616b1f4ac5541b06f4fd081b5acc35b59da0d645fb8a6a48e2b49e2a055"} Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.551399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7ecbdd9-b034-4731-b13b-65deed2224aa","Type":"ContainerStarted","Data":"3ed10498fb58e1c19bd900105cf8f9e70943729f6e0a3bcc767de43309b34559"} Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.553235 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" podStartSLOduration=4.553208975 podStartE2EDuration="4.553208975s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:38.53823454 +0000 UTC m=+1318.088155199" watchObservedRunningTime="2026-03-18 
12:31:38.553208975 +0000 UTC m=+1318.103129614" Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.608709 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-5bplm"] Mar 18 12:31:38 crc kubenswrapper[4921]: I0318 12:31:38.695502 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-5bplm"] Mar 18 12:31:38 crc kubenswrapper[4921]: E0318 12:31:38.784305 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88102bf2_01c2_4cfe_a798_4deac7803ec0.slice\": RecentStats: unable to find data in memory cache]" Mar 18 12:31:39 crc kubenswrapper[4921]: I0318 12:31:39.229387 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88102bf2-01c2-4cfe-a798-4deac7803ec0" path="/var/lib/kubelet/pods/88102bf2-01c2-4cfe-a798-4deac7803ec0/volumes" Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.583154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b2dc6b10-845f-455a-828b-36e6eafc21f4","Type":"ContainerStarted","Data":"58b26352299d15380b74dc8d7669f5decdb94c4de3d713570c141dbcfffc2c8f"} Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.584491 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-log" containerID="cri-o://8cb60616b1f4ac5541b06f4fd081b5acc35b59da0d645fb8a6a48e2b49e2a055" gracePeriod=30 Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.585500 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-httpd" containerID="cri-o://58b26352299d15380b74dc8d7669f5decdb94c4de3d713570c141dbcfffc2c8f" 
gracePeriod=30 Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.591515 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7ecbdd9-b034-4731-b13b-65deed2224aa","Type":"ContainerStarted","Data":"e2f49f027bcc8372642f1cae8695b51fe2ba6bf8159b6edec7944d52b8aec1e8"} Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.591753 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-log" containerID="cri-o://3ed10498fb58e1c19bd900105cf8f9e70943729f6e0a3bcc767de43309b34559" gracePeriod=30 Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.591895 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-httpd" containerID="cri-o://e2f49f027bcc8372642f1cae8695b51fe2ba6bf8159b6edec7944d52b8aec1e8" gracePeriod=30 Mar 18 12:31:40 crc kubenswrapper[4921]: I0318 12:31:40.650031 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.650008958 podStartE2EDuration="6.650008958s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:40.61417826 +0000 UTC m=+1320.164098899" watchObservedRunningTime="2026-03-18 12:31:40.650008958 +0000 UTC m=+1320.199929597" Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.251810 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.251791731 podStartE2EDuration="7.251791731s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:40.648422623 +0000 UTC m=+1320.198343272" watchObservedRunningTime="2026-03-18 12:31:41.251791731 +0000 UTC m=+1320.801712370" Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.611918 4921 generic.go:334] "Generic (PLEG): container finished" podID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerID="e2f49f027bcc8372642f1cae8695b51fe2ba6bf8159b6edec7944d52b8aec1e8" exitCode=0 Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.611953 4921 generic.go:334] "Generic (PLEG): container finished" podID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerID="3ed10498fb58e1c19bd900105cf8f9e70943729f6e0a3bcc767de43309b34559" exitCode=143 Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.612029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7ecbdd9-b034-4731-b13b-65deed2224aa","Type":"ContainerDied","Data":"e2f49f027bcc8372642f1cae8695b51fe2ba6bf8159b6edec7944d52b8aec1e8"} Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.612191 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7ecbdd9-b034-4731-b13b-65deed2224aa","Type":"ContainerDied","Data":"3ed10498fb58e1c19bd900105cf8f9e70943729f6e0a3bcc767de43309b34559"} Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.614517 4921 generic.go:334] "Generic (PLEG): container finished" podID="ed1c9bae-63a4-4e5d-8c81-3113898f71c7" containerID="3304fa91e8bbabf4082cc5197900d74b2b6e74d5eb12fcdb90a6a309eb2fc462" exitCode=0 Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.614598 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9m8l" event={"ID":"ed1c9bae-63a4-4e5d-8c81-3113898f71c7","Type":"ContainerDied","Data":"3304fa91e8bbabf4082cc5197900d74b2b6e74d5eb12fcdb90a6a309eb2fc462"} Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.619255 4921 
generic.go:334] "Generic (PLEG): container finished" podID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerID="58b26352299d15380b74dc8d7669f5decdb94c4de3d713570c141dbcfffc2c8f" exitCode=0 Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.619291 4921 generic.go:334] "Generic (PLEG): container finished" podID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerID="8cb60616b1f4ac5541b06f4fd081b5acc35b59da0d645fb8a6a48e2b49e2a055" exitCode=143 Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.619319 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b2dc6b10-845f-455a-828b-36e6eafc21f4","Type":"ContainerDied","Data":"58b26352299d15380b74dc8d7669f5decdb94c4de3d713570c141dbcfffc2c8f"} Mar 18 12:31:41 crc kubenswrapper[4921]: I0318 12:31:41.619349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b2dc6b10-845f-455a-828b-36e6eafc21f4","Type":"ContainerDied","Data":"8cb60616b1f4ac5541b06f4fd081b5acc35b59da0d645fb8a6a48e2b49e2a055"} Mar 18 12:31:45 crc kubenswrapper[4921]: I0318 12:31:45.591333 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:31:45 crc kubenswrapper[4921]: I0318 12:31:45.657479 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gkwk7"] Mar 18 12:31:45 crc kubenswrapper[4921]: I0318 12:31:45.657772 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gkwk7" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" containerID="cri-o://41fa41eca02ba769262b6d0dbf93db9fda957aa59e98c9cf862735f757d8a357" gracePeriod=10 Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.154252 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.167382 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258570 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-httpd-run\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258662 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-combined-ca-bundle\") pod \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258696 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-config-data\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258727 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5qzh\" (UniqueName: \"kubernetes.io/projected/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-kube-api-access-f5qzh\") pod \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258776 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-fernet-keys\") pod \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\" (UID: 
\"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258804 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-combined-ca-bundle\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258863 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-credential-keys\") pod \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258925 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-scripts\") pod \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.258971 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-logs\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.259041 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-config-data\") pod \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\" (UID: \"ed1c9bae-63a4-4e5d-8c81-3113898f71c7\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.259061 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.259161 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-scripts\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.259188 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgzfr\" (UniqueName: \"kubernetes.io/projected/c7ecbdd9-b034-4731-b13b-65deed2224aa-kube-api-access-zgzfr\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.259244 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-public-tls-certs\") pod \"c7ecbdd9-b034-4731-b13b-65deed2224aa\" (UID: \"c7ecbdd9-b034-4731-b13b-65deed2224aa\") " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.259595 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.262297 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-logs" (OuterVolumeSpecName: "logs") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.277715 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-scripts" (OuterVolumeSpecName: "scripts") pod "ed1c9bae-63a4-4e5d-8c81-3113898f71c7" (UID: "ed1c9bae-63a4-4e5d-8c81-3113898f71c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.279630 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-kube-api-access-f5qzh" (OuterVolumeSpecName: "kube-api-access-f5qzh") pod "ed1c9bae-63a4-4e5d-8c81-3113898f71c7" (UID: "ed1c9bae-63a4-4e5d-8c81-3113898f71c7"). InnerVolumeSpecName "kube-api-access-f5qzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.283429 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.288334 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed1c9bae-63a4-4e5d-8c81-3113898f71c7" (UID: "ed1c9bae-63a4-4e5d-8c81-3113898f71c7"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.293346 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed1c9bae-63a4-4e5d-8c81-3113898f71c7" (UID: "ed1c9bae-63a4-4e5d-8c81-3113898f71c7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.309261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-scripts" (OuterVolumeSpecName: "scripts") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.339246 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ecbdd9-b034-4731-b13b-65deed2224aa-kube-api-access-zgzfr" (OuterVolumeSpecName: "kube-api-access-zgzfr") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "kube-api-access-zgzfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.351681 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.377610 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed1c9bae-63a4-4e5d-8c81-3113898f71c7" (UID: "ed1c9bae-63a4-4e5d-8c81-3113898f71c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.377967 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378094 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378155 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5qzh\" (UniqueName: \"kubernetes.io/projected/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-kube-api-access-f5qzh\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378170 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378250 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378267 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378281 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378293 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7ecbdd9-b034-4731-b13b-65deed2224aa-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378339 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378354 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.378365 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgzfr\" (UniqueName: \"kubernetes.io/projected/c7ecbdd9-b034-4731-b13b-65deed2224aa-kube-api-access-zgzfr\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.388032 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-config-data" (OuterVolumeSpecName: "config-data") pod "ed1c9bae-63a4-4e5d-8c81-3113898f71c7" (UID: "ed1c9bae-63a4-4e5d-8c81-3113898f71c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.414744 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-config-data" (OuterVolumeSpecName: "config-data") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.417260 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.434703 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7ecbdd9-b034-4731-b13b-65deed2224aa" (UID: "c7ecbdd9-b034-4731-b13b-65deed2224aa"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.480503 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.480533 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7ecbdd9-b034-4731-b13b-65deed2224aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.480542 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed1c9bae-63a4-4e5d-8c81-3113898f71c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.480552 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.681392 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s9m8l" event={"ID":"ed1c9bae-63a4-4e5d-8c81-3113898f71c7","Type":"ContainerDied","Data":"e6312b8072ca2869aa57e499e58ce87d3ad88e6d4e3181d417eb5b83ebf2b9b7"} Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.681470 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6312b8072ca2869aa57e499e58ce87d3ad88e6d4e3181d417eb5b83ebf2b9b7" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.681400 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s9m8l" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.686084 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.686076 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7ecbdd9-b034-4731-b13b-65deed2224aa","Type":"ContainerDied","Data":"95cb2a8c8ee0f43fb859f8ca23a46d176c88838af73f340e008e85642ce66cd7"} Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.686258 4921 scope.go:117] "RemoveContainer" containerID="e2f49f027bcc8372642f1cae8695b51fe2ba6bf8159b6edec7944d52b8aec1e8" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.694449 4921 generic.go:334] "Generic (PLEG): container finished" podID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerID="41fa41eca02ba769262b6d0dbf93db9fda957aa59e98c9cf862735f757d8a357" exitCode=0 Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.694521 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gkwk7" event={"ID":"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938","Type":"ContainerDied","Data":"41fa41eca02ba769262b6d0dbf93db9fda957aa59e98c9cf862735f757d8a357"} Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.730374 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.741457 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.766767 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:46 crc kubenswrapper[4921]: E0318 12:31:46.767227 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1c9bae-63a4-4e5d-8c81-3113898f71c7" containerName="keystone-bootstrap" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767246 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1c9bae-63a4-4e5d-8c81-3113898f71c7" 
containerName="keystone-bootstrap" Mar 18 12:31:46 crc kubenswrapper[4921]: E0318 12:31:46.767258 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-log" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767266 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-log" Mar 18 12:31:46 crc kubenswrapper[4921]: E0318 12:31:46.767281 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-httpd" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767289 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-httpd" Mar 18 12:31:46 crc kubenswrapper[4921]: E0318 12:31:46.767312 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88102bf2-01c2-4cfe-a798-4deac7803ec0" containerName="init" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767318 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88102bf2-01c2-4cfe-a798-4deac7803ec0" containerName="init" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767467 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1c9bae-63a4-4e5d-8c81-3113898f71c7" containerName="keystone-bootstrap" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767484 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-log" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767501 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" containerName="glance-httpd" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.767518 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88102bf2-01c2-4cfe-a798-4deac7803ec0" containerName="init" Mar 18 12:31:46 
crc kubenswrapper[4921]: I0318 12:31:46.768557 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.772256 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.772860 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.791707 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.889849 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvd2b\" (UniqueName: \"kubernetes.io/projected/9f4a58e9-3870-4b79-bbb6-6ec610898b96-kube-api-access-mvd2b\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.889926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.889950 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.889969 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.889996 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.890024 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-logs\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.890145 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.890402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992075 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992177 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvd2b\" (UniqueName: \"kubernetes.io/projected/9f4a58e9-3870-4b79-bbb6-6ec610898b96-kube-api-access-mvd2b\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992331 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992360 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992424 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-logs\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992472 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992713 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.992772 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-logs\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:46 crc kubenswrapper[4921]: I0318 12:31:46.997220 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.002186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.002505 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.003150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.013783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvd2b\" (UniqueName: \"kubernetes.io/projected/9f4a58e9-3870-4b79-bbb6-6ec610898b96-kube-api-access-mvd2b\") pod 
\"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.024051 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.081065 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.081154 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.090927 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.228234 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ecbdd9-b034-4731-b13b-65deed2224aa" path="/var/lib/kubelet/pods/c7ecbdd9-b034-4731-b13b-65deed2224aa/volumes" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.376073 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s9m8l"] Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.383823 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s9m8l"] Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.549224 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sthxw"] Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.551016 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.563235 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.575302 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kqlk9" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.575912 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.576144 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.576334 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.601232 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sthxw"] Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 
12:31:47.628302 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-credential-keys\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.628366 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-scripts\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.628386 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchmw\" (UniqueName: \"kubernetes.io/projected/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-kube-api-access-tchmw\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.628440 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-combined-ca-bundle\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.628486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-fernet-keys\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 
12:31:47.628601 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-config-data\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.734204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-config-data\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.734303 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-credential-keys\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.734336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-scripts\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.734363 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchmw\" (UniqueName: \"kubernetes.io/projected/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-kube-api-access-tchmw\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.734401 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-combined-ca-bundle\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.734427 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-fernet-keys\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.743640 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-combined-ca-bundle\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.744140 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-config-data\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.746397 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-fernet-keys\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.747504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-scripts\") pod \"keystone-bootstrap-sthxw\" (UID: 
\"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.751263 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-credential-keys\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.758193 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchmw\" (UniqueName: \"kubernetes.io/projected/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-kube-api-access-tchmw\") pod \"keystone-bootstrap-sthxw\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:47 crc kubenswrapper[4921]: I0318 12:31:47.958742 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:31:49 crc kubenswrapper[4921]: I0318 12:31:49.220771 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1c9bae-63a4-4e5d-8c81-3113898f71c7" path="/var/lib/kubelet/pods/ed1c9bae-63a4-4e5d-8c81-3113898f71c7/volumes" Mar 18 12:31:54 crc kubenswrapper[4921]: I0318 12:31:54.076871 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gkwk7" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.077821 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gkwk7" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Mar 18 12:31:59 crc kubenswrapper[4921]: E0318 12:31:59.196266 4921 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 12:31:59 crc kubenswrapper[4921]: E0318 12:31:59.196471 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j2lpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io
/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nszlz_openstack(0022dc9f-31d2-440f-831a-ae0a03c22b63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:31:59 crc kubenswrapper[4921]: E0318 12:31:59.199478 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nszlz" podUID="0022dc9f-31d2-440f-831a-ae0a03c22b63" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.227928 4921 scope.go:117] "RemoveContainer" containerID="3ed10498fb58e1c19bd900105cf8f9e70943729f6e0a3bcc767de43309b34559" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.247672 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.274820 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381129 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd8v8\" (UniqueName: \"kubernetes.io/projected/b2dc6b10-845f-455a-828b-36e6eafc21f4-kube-api-access-wd8v8\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381200 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x6b9\" (UniqueName: \"kubernetes.io/projected/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-kube-api-access-8x6b9\") pod \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381259 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-logs\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381304 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-internal-tls-certs\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381333 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-nb\") pod \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381353 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381377 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-config\") pod \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381543 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-dns-svc\") pod \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381621 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-combined-ca-bundle\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381645 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-config-data\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381704 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-httpd-run\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381747 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-sb\") pod \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\" (UID: \"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.381797 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-scripts\") pod \"b2dc6b10-845f-455a-828b-36e6eafc21f4\" (UID: \"b2dc6b10-845f-455a-828b-36e6eafc21f4\") " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.388793 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.388991 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-scripts" (OuterVolumeSpecName: "scripts") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.389204 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-logs" (OuterVolumeSpecName: "logs") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.389274 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-kube-api-access-8x6b9" (OuterVolumeSpecName: "kube-api-access-8x6b9") pod "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" (UID: "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938"). InnerVolumeSpecName "kube-api-access-8x6b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.396349 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.413354 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2dc6b10-845f-455a-828b-36e6eafc21f4-kube-api-access-wd8v8" (OuterVolumeSpecName: "kube-api-access-wd8v8") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "kube-api-access-wd8v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.452398 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.474077 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" (UID: "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.476544 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-config" (OuterVolumeSpecName: "config") pod "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" (UID: "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.484700 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" (UID: "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.488855 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd8v8\" (UniqueName: \"kubernetes.io/projected/b2dc6b10-845f-455a-828b-36e6eafc21f4-kube-api-access-wd8v8\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.488891 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x6b9\" (UniqueName: \"kubernetes.io/projected/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-kube-api-access-8x6b9\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.488904 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499659 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499702 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499739 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499752 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499768 4921 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2dc6b10-845f-455a-828b-36e6eafc21f4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499782 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.499802 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.515900 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.551997 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-config-data" (OuterVolumeSpecName: "config-data") pod "b2dc6b10-845f-455a-828b-36e6eafc21f4" (UID: "b2dc6b10-845f-455a-828b-36e6eafc21f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.554298 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" (UID: "fd9e8b49-6e29-454b-bbd8-2b2e3d45b938"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.572742 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.601725 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.602186 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.602200 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.602209 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2dc6b10-845f-455a-828b-36e6eafc21f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.848909 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gkwk7" event={"ID":"fd9e8b49-6e29-454b-bbd8-2b2e3d45b938","Type":"ContainerDied","Data":"b29b1b29b8e30e1739d13e2af677d49561f637553fecce3037de35d4fb297e70"} Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.848955 4921 scope.go:117] "RemoveContainer" containerID="41fa41eca02ba769262b6d0dbf93db9fda957aa59e98c9cf862735f757d8a357" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.848975 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gkwk7" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.879189 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p4hhn" event={"ID":"5cce4ccd-6a44-4c00-b7f5-9c74946eb308","Type":"ContainerStarted","Data":"5a591efba1fea098e4e2ee9a5031ae5425e8b92bd58ca9d3acceb4eae64db51a"} Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.882238 4921 scope.go:117] "RemoveContainer" containerID="3bc6b45b28d208ccd35f4737c3c49593f8e255a8ff5900ff124377ea2b972816" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.883270 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghx6h" event={"ID":"b7908371-b4b8-4437-be4a-13b8fccb6a9f","Type":"ContainerStarted","Data":"f72fcb864229e0724c793833de3329ec5c8607f10e0f065d1a32fcdf40c91c4c"} Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.899297 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b2dc6b10-845f-455a-828b-36e6eafc21f4","Type":"ContainerDied","Data":"489a53e268288468e9d3596aca8d5681724ac913c2497088eeb8c1495e699e9f"} Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.899653 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.906095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerStarted","Data":"5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0"} Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.926493 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-p4hhn" podStartSLOduration=2.8251705339999997 podStartE2EDuration="25.926444443s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="2026-03-18 12:31:36.160317958 +0000 UTC m=+1315.710238597" lastFinishedPulling="2026-03-18 12:31:59.261591867 +0000 UTC m=+1338.811512506" observedRunningTime="2026-03-18 12:31:59.906360162 +0000 UTC m=+1339.456280801" watchObservedRunningTime="2026-03-18 12:31:59.926444443 +0000 UTC m=+1339.476365082" Mar 18 12:31:59 crc kubenswrapper[4921]: E0318 12:31:59.926939 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-nszlz" podUID="0022dc9f-31d2-440f-831a-ae0a03c22b63" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.927204 4921 scope.go:117] "RemoveContainer" containerID="58b26352299d15380b74dc8d7669f5decdb94c4de3d713570c141dbcfffc2c8f" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.977390 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gkwk7"] Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.982513 4921 scope.go:117] "RemoveContainer" containerID="8cb60616b1f4ac5541b06f4fd081b5acc35b59da0d645fb8a6a48e2b49e2a055" Mar 18 12:31:59 crc kubenswrapper[4921]: I0318 12:31:59.996238 4921 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gkwk7"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.000306 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ghx6h" podStartSLOduration=3.108685771 podStartE2EDuration="26.000283791s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="2026-03-18 12:31:36.420934485 +0000 UTC m=+1315.970855124" lastFinishedPulling="2026-03-18 12:31:59.312532505 +0000 UTC m=+1338.862453144" observedRunningTime="2026-03-18 12:31:59.975275811 +0000 UTC m=+1339.525196450" watchObservedRunningTime="2026-03-18 12:32:00.000283791 +0000 UTC m=+1339.550204430" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.033231 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.042169 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.105340 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sthxw"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.123628 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:00 crc kubenswrapper[4921]: E0318 12:32:00.124067 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.124086 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" Mar 18 12:32:00 crc kubenswrapper[4921]: E0318 12:32:00.124096 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-httpd" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 
12:32:00.128185 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-httpd" Mar 18 12:32:00 crc kubenswrapper[4921]: E0318 12:32:00.128274 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="init" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.128283 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="init" Mar 18 12:32:00 crc kubenswrapper[4921]: E0318 12:32:00.128323 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-log" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.128334 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-log" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.128634 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-log" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.128652 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" containerName="glance-httpd" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.128682 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.129754 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.134547 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.134791 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.183237 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.194785 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563952-lfdc6"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.196166 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.200542 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.200902 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.201084 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.206518 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-lfdc6"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221368 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221438 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221472 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-logs\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221519 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221541 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221569 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpwd\" (UniqueName: \"kubernetes.io/projected/879edc6c-5a15-4316-9f8f-58bcf8d87b95-kube-api-access-4qpwd\") pod 
\"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.221678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.224038 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.322938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpwd\" (UniqueName: \"kubernetes.io/projected/879edc6c-5a15-4316-9f8f-58bcf8d87b95-kube-api-access-4qpwd\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323002 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323085 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6pbl\" (UniqueName: \"kubernetes.io/projected/1d39e338-b192-4d26-b34e-06f358a643f3-kube-api-access-q6pbl\") pod \"auto-csr-approver-29563952-lfdc6\" (UID: \"1d39e338-b192-4d26-b34e-06f358a643f3\") " pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323143 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323195 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323254 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-logs\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323349 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.323376 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.324399 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.327148 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-logs\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.329520 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.329648 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.333174 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.335328 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.338813 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.357654 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpwd\" (UniqueName: \"kubernetes.io/projected/879edc6c-5a15-4316-9f8f-58bcf8d87b95-kube-api-access-4qpwd\") pod \"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.383455 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.425156 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6pbl\" (UniqueName: \"kubernetes.io/projected/1d39e338-b192-4d26-b34e-06f358a643f3-kube-api-access-q6pbl\") pod \"auto-csr-approver-29563952-lfdc6\" (UID: \"1d39e338-b192-4d26-b34e-06f358a643f3\") " pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.446256 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6pbl\" (UniqueName: \"kubernetes.io/projected/1d39e338-b192-4d26-b34e-06f358a643f3-kube-api-access-q6pbl\") pod \"auto-csr-approver-29563952-lfdc6\" (UID: \"1d39e338-b192-4d26-b34e-06f358a643f3\") " pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.479755 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.499872 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.918456 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sthxw" event={"ID":"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69","Type":"ContainerStarted","Data":"134fe1b336feadcd0f94e532c2e60882555cb33bbe21adb7cf4ca856bed8d851"} Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.918910 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sthxw" event={"ID":"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69","Type":"ContainerStarted","Data":"1ad1fef45b144530d5422cf35acc81a64137fe8d6486ca012ac6be64f182684d"} Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.927510 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f4a58e9-3870-4b79-bbb6-6ec610898b96","Type":"ContainerStarted","Data":"1c88dd00f26cc529669a6320c09b03e3bbd8747345b20fcfd4d183ea8ba4060b"} Mar 18 12:32:00 crc kubenswrapper[4921]: I0318 12:32:00.942952 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sthxw" podStartSLOduration=13.942931073 podStartE2EDuration="13.942931073s" podCreationTimestamp="2026-03-18 12:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:00.93967672 +0000 UTC m=+1340.489597359" watchObservedRunningTime="2026-03-18 12:32:00.942931073 +0000 UTC m=+1340.492851712" Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.079137 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-lfdc6"] Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.202233 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.248248 4921 
pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbca08ff0-5b2f-4bb8-8624-05eacf4a7c8b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbca08ff0-5b2f-4bb8-8624-05eacf4a7c8b] : Timed out while waiting for systemd to remove kubepods-besteffort-podbca08ff0_5b2f_4bb8_8624_05eacf4a7c8b.slice" Mar 18 12:32:01 crc kubenswrapper[4921]: E0318 12:32:01.248302 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podbca08ff0-5b2f-4bb8-8624-05eacf4a7c8b] : unable to destroy cgroup paths for cgroup [kubepods besteffort podbca08ff0-5b2f-4bb8-8624-05eacf4a7c8b] : Timed out while waiting for systemd to remove kubepods-besteffort-podbca08ff0_5b2f_4bb8_8624_05eacf4a7c8b.slice" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8" podUID="bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.272704 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2dc6b10-845f-455a-828b-36e6eafc21f4" path="/var/lib/kubelet/pods/b2dc6b10-845f-455a-828b-36e6eafc21f4/volumes" Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.274827 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" path="/var/lib/kubelet/pods/fd9e8b49-6e29-454b-bbd8-2b2e3d45b938/volumes" Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.969732 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"879edc6c-5a15-4316-9f8f-58bcf8d87b95","Type":"ContainerStarted","Data":"d33a6316a4c3fe7068742d37db1f80050b6301b34b172dd6d19c9e6e6cd08b88"} Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.971524 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" 
event={"ID":"1d39e338-b192-4d26-b34e-06f358a643f3","Type":"ContainerStarted","Data":"57f11a786ce723a5ee095fe1f42f973ee3b940fbc7f7edd90721b6b1ba15497f"} Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.986163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f4a58e9-3870-4b79-bbb6-6ec610898b96","Type":"ContainerStarted","Data":"9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509"} Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.986213 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-nbsg8" Mar 18 12:32:01 crc kubenswrapper[4921]: I0318 12:32:01.986231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f4a58e9-3870-4b79-bbb6-6ec610898b96","Type":"ContainerStarted","Data":"6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9"} Mar 18 12:32:02 crc kubenswrapper[4921]: I0318 12:32:02.067963 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-nbsg8"] Mar 18 12:32:02 crc kubenswrapper[4921]: I0318 12:32:02.082205 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-nbsg8"] Mar 18 12:32:02 crc kubenswrapper[4921]: I0318 12:32:02.086219 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.086136893 podStartE2EDuration="16.086136893s" podCreationTimestamp="2026-03-18 12:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:02.057702915 +0000 UTC m=+1341.607623554" watchObservedRunningTime="2026-03-18 12:32:02.086136893 +0000 UTC m=+1341.636057532" Mar 18 12:32:02 crc kubenswrapper[4921]: I0318 12:32:02.996796 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerStarted","Data":"7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8"} Mar 18 12:32:02 crc kubenswrapper[4921]: I0318 12:32:02.999693 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"879edc6c-5a15-4316-9f8f-58bcf8d87b95","Type":"ContainerStarted","Data":"34a5c49d33b50f9daecefad0b2a32f7081ca17c826b1bebb75452011680a9f66"} Mar 18 12:32:03 crc kubenswrapper[4921]: I0318 12:32:03.001155 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" event={"ID":"1d39e338-b192-4d26-b34e-06f358a643f3","Type":"ContainerStarted","Data":"ac7ed7ac22161ef7e0586585869b63f5c860106566f8be795e90e2efe3141fc5"} Mar 18 12:32:03 crc kubenswrapper[4921]: I0318 12:32:03.019182 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" podStartSLOduration=1.878525134 podStartE2EDuration="3.019164771s" podCreationTimestamp="2026-03-18 12:32:00 +0000 UTC" firstStartedPulling="2026-03-18 12:32:01.08955698 +0000 UTC m=+1340.639477619" lastFinishedPulling="2026-03-18 12:32:02.230196617 +0000 UTC m=+1341.780117256" observedRunningTime="2026-03-18 12:32:03.013231292 +0000 UTC m=+1342.563151931" watchObservedRunningTime="2026-03-18 12:32:03.019164771 +0000 UTC m=+1342.569085410" Mar 18 12:32:03 crc kubenswrapper[4921]: I0318 12:32:03.224084 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b" path="/var/lib/kubelet/pods/bca08ff0-5b2f-4bb8-8624-05eacf4a7c8b/volumes" Mar 18 12:32:04 crc kubenswrapper[4921]: I0318 12:32:04.012234 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"879edc6c-5a15-4316-9f8f-58bcf8d87b95","Type":"ContainerStarted","Data":"b8750eede6490ad6a6c09658966fb05cac1bc82d03658becf1a1c604c33190d2"} Mar 18 12:32:04 crc kubenswrapper[4921]: I0318 12:32:04.013881 4921 generic.go:334] "Generic (PLEG): container finished" podID="1d39e338-b192-4d26-b34e-06f358a643f3" containerID="ac7ed7ac22161ef7e0586585869b63f5c860106566f8be795e90e2efe3141fc5" exitCode=0 Mar 18 12:32:04 crc kubenswrapper[4921]: I0318 12:32:04.013927 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" event={"ID":"1d39e338-b192-4d26-b34e-06f358a643f3","Type":"ContainerDied","Data":"ac7ed7ac22161ef7e0586585869b63f5c860106566f8be795e90e2efe3141fc5"} Mar 18 12:32:04 crc kubenswrapper[4921]: I0318 12:32:04.042805 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.042783443 podStartE2EDuration="5.042783443s" podCreationTimestamp="2026-03-18 12:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:04.033601422 +0000 UTC m=+1343.583522051" watchObservedRunningTime="2026-03-18 12:32:04.042783443 +0000 UTC m=+1343.592704082" Mar 18 12:32:04 crc kubenswrapper[4921]: I0318 12:32:04.079269 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gkwk7" podUID="fd9e8b49-6e29-454b-bbd8-2b2e3d45b938" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Mar 18 12:32:05 crc kubenswrapper[4921]: I0318 12:32:05.030229 4921 generic.go:334] "Generic (PLEG): container finished" podID="ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" containerID="134fe1b336feadcd0f94e532c2e60882555cb33bbe21adb7cf4ca856bed8d851" exitCode=0 Mar 18 12:32:05 crc kubenswrapper[4921]: I0318 12:32:05.031264 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-sthxw" event={"ID":"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69","Type":"ContainerDied","Data":"134fe1b336feadcd0f94e532c2e60882555cb33bbe21adb7cf4ca856bed8d851"} Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.064466 4921 generic.go:334] "Generic (PLEG): container finished" podID="c00f5d00-73a8-4268-acc9-49f809cf6d7f" containerID="24e1e708b5358a5afd58f060787e1ff245958125b4d36a41a5a823dfeefd9233" exitCode=0 Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.064946 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7wrc2" event={"ID":"c00f5d00-73a8-4268-acc9-49f809cf6d7f","Type":"ContainerDied","Data":"24e1e708b5358a5afd58f060787e1ff245958125b4d36a41a5a823dfeefd9233"} Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.069588 4921 generic.go:334] "Generic (PLEG): container finished" podID="b7908371-b4b8-4437-be4a-13b8fccb6a9f" containerID="f72fcb864229e0724c793833de3329ec5c8607f10e0f065d1a32fcdf40c91c4c" exitCode=0 Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.069657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghx6h" event={"ID":"b7908371-b4b8-4437-be4a-13b8fccb6a9f","Type":"ContainerDied","Data":"f72fcb864229e0724c793833de3329ec5c8607f10e0f065d1a32fcdf40c91c4c"} Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.072653 4921 generic.go:334] "Generic (PLEG): container finished" podID="5cce4ccd-6a44-4c00-b7f5-9c74946eb308" containerID="5a591efba1fea098e4e2ee9a5031ae5425e8b92bd58ca9d3acceb4eae64db51a" exitCode=0 Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.072865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p4hhn" event={"ID":"5cce4ccd-6a44-4c00-b7f5-9c74946eb308","Type":"ContainerDied","Data":"5a591efba1fea098e4e2ee9a5031ae5425e8b92bd58ca9d3acceb4eae64db51a"} Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.875622 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:06 crc kubenswrapper[4921]: I0318 12:32:06.884087 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.007661 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6pbl\" (UniqueName: \"kubernetes.io/projected/1d39e338-b192-4d26-b34e-06f358a643f3-kube-api-access-q6pbl\") pod \"1d39e338-b192-4d26-b34e-06f358a643f3\" (UID: \"1d39e338-b192-4d26-b34e-06f358a643f3\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.007806 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-credential-keys\") pod \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.007855 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tchmw\" (UniqueName: \"kubernetes.io/projected/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-kube-api-access-tchmw\") pod \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.007923 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-config-data\") pod \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.008011 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-fernet-keys\") pod \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\" 
(UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.008060 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-combined-ca-bundle\") pod \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.008084 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-scripts\") pod \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\" (UID: \"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.016944 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" (UID: "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.017147 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" (UID: "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.018052 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d39e338-b192-4d26-b34e-06f358a643f3-kube-api-access-q6pbl" (OuterVolumeSpecName: "kube-api-access-q6pbl") pod "1d39e338-b192-4d26-b34e-06f358a643f3" (UID: "1d39e338-b192-4d26-b34e-06f358a643f3"). 
InnerVolumeSpecName "kube-api-access-q6pbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.030355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-kube-api-access-tchmw" (OuterVolumeSpecName: "kube-api-access-tchmw") pod "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" (UID: "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69"). InnerVolumeSpecName "kube-api-access-tchmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.030480 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-scripts" (OuterVolumeSpecName: "scripts") pod "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" (UID: "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.056044 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-config-data" (OuterVolumeSpecName: "config-data") pod "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" (UID: "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.062327 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" (UID: "ec467fe9-ec1a-4b58-a0b0-b745e4c41f69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.091336 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.091388 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.096745 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sthxw" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.099087 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sthxw" event={"ID":"ec467fe9-ec1a-4b58-a0b0-b745e4c41f69","Type":"ContainerDied","Data":"1ad1fef45b144530d5422cf35acc81a64137fe8d6486ca012ac6be64f182684d"} Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.099151 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ad1fef45b144530d5422cf35acc81a64137fe8d6486ca012ac6be64f182684d" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.104562 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.110872 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-lfdc6" event={"ID":"1d39e338-b192-4d26-b34e-06f358a643f3","Type":"ContainerDied","Data":"57f11a786ce723a5ee095fe1f42f973ee3b940fbc7f7edd90721b6b1ba15497f"} Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.110914 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f11a786ce723a5ee095fe1f42f973ee3b940fbc7f7edd90721b6b1ba15497f" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.116621 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.116868 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.116880 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6pbl\" (UniqueName: \"kubernetes.io/projected/1d39e338-b192-4d26-b34e-06f358a643f3-kube-api-access-q6pbl\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.116891 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.116901 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tchmw\" (UniqueName: \"kubernetes.io/projected/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-kube-api-access-tchmw\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc 
kubenswrapper[4921]: I0318 12:32:07.116911 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.116919 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.165446 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.172591 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.175477 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56b6658ccd-lzk2m"] Mar 18 12:32:07 crc kubenswrapper[4921]: E0318 12:32:07.175855 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d39e338-b192-4d26-b34e-06f358a643f3" containerName="oc" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.175870 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d39e338-b192-4d26-b34e-06f358a643f3" containerName="oc" Mar 18 12:32:07 crc kubenswrapper[4921]: E0318 12:32:07.175883 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" containerName="keystone-bootstrap" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.175889 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" containerName="keystone-bootstrap" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.176052 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" 
containerName="keystone-bootstrap" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.176072 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d39e338-b192-4d26-b34e-06f358a643f3" containerName="oc" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.176604 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.178863 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.179399 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.179472 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.185508 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.185512 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kqlk9" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.185647 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.197902 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56b6658ccd-lzk2m"] Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.345458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-scripts\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 
12:32:07.345628 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-combined-ca-bundle\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.345684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-internal-tls-certs\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.345710 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-public-tls-certs\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.345732 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnl22\" (UniqueName: \"kubernetes.io/projected/5e6d9230-4481-43b3-891b-066a3bc6a46f-kube-api-access-rnl22\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.345795 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-fernet-keys\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc 
kubenswrapper[4921]: I0318 12:32:07.345820 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-config-data\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.345840 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-credential-keys\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.446918 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-combined-ca-bundle\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.446980 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-internal-tls-certs\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.447014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-public-tls-certs\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.447035 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnl22\" (UniqueName: \"kubernetes.io/projected/5e6d9230-4481-43b3-891b-066a3bc6a46f-kube-api-access-rnl22\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.447078 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-fernet-keys\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.447123 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-config-data\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.447141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-credential-keys\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.447166 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-scripts\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.453089 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-fernet-keys\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.453548 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-internal-tls-certs\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.453921 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-scripts\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.454021 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-public-tls-certs\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.454349 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-credential-keys\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.454779 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-combined-ca-bundle\") pod \"keystone-56b6658ccd-lzk2m\" (UID: 
\"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.457166 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-config-data\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.463660 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnl22\" (UniqueName: \"kubernetes.io/projected/5e6d9230-4481-43b3-891b-066a3bc6a46f-kube-api-access-rnl22\") pod \"keystone-56b6658ccd-lzk2m\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.474587 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.549654 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-db-sync-config-data\") pod \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.549749 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/b7908371-b4b8-4437-be4a-13b8fccb6a9f-kube-api-access-gt6qk\") pod \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.549790 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-combined-ca-bundle\") pod \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\" (UID: \"b7908371-b4b8-4437-be4a-13b8fccb6a9f\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.554696 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7908371-b4b8-4437-be4a-13b8fccb6a9f-kube-api-access-gt6qk" (OuterVolumeSpecName: "kube-api-access-gt6qk") pod "b7908371-b4b8-4437-be4a-13b8fccb6a9f" (UID: "b7908371-b4b8-4437-be4a-13b8fccb6a9f"). InnerVolumeSpecName "kube-api-access-gt6qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.556390 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.556445 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b7908371-b4b8-4437-be4a-13b8fccb6a9f" (UID: "b7908371-b4b8-4437-be4a-13b8fccb6a9f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.585819 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7908371-b4b8-4437-be4a-13b8fccb6a9f" (UID: "b7908371-b4b8-4437-be4a-13b8fccb6a9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.638311 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p4hhn" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.646176 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.657633 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.657661 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7908371-b4b8-4437-be4a-13b8fccb6a9f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.657671 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/b7908371-b4b8-4437-be4a-13b8fccb6a9f-kube-api-access-gt6qk\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758438 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-config-data\") pod \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758558 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-logs\") pod \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758610 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcvwl\" (UniqueName: 
\"kubernetes.io/projected/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-kube-api-access-lcvwl\") pod \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758641 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-scripts\") pod \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758695 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-combined-ca-bundle\") pod \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758725 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-config\") pod \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758826 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnhkg\" (UniqueName: \"kubernetes.io/projected/c00f5d00-73a8-4268-acc9-49f809cf6d7f-kube-api-access-pnhkg\") pod \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\" (UID: \"c00f5d00-73a8-4268-acc9-49f809cf6d7f\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.758883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-combined-ca-bundle\") pod \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\" (UID: \"5cce4ccd-6a44-4c00-b7f5-9c74946eb308\") " Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 
12:32:07.759012 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-logs" (OuterVolumeSpecName: "logs") pod "5cce4ccd-6a44-4c00-b7f5-9c74946eb308" (UID: "5cce4ccd-6a44-4c00-b7f5-9c74946eb308"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.759469 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.762583 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-scripts" (OuterVolumeSpecName: "scripts") pod "5cce4ccd-6a44-4c00-b7f5-9c74946eb308" (UID: "5cce4ccd-6a44-4c00-b7f5-9c74946eb308"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.763177 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00f5d00-73a8-4268-acc9-49f809cf6d7f-kube-api-access-pnhkg" (OuterVolumeSpecName: "kube-api-access-pnhkg") pod "c00f5d00-73a8-4268-acc9-49f809cf6d7f" (UID: "c00f5d00-73a8-4268-acc9-49f809cf6d7f"). InnerVolumeSpecName "kube-api-access-pnhkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.766792 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-kube-api-access-lcvwl" (OuterVolumeSpecName: "kube-api-access-lcvwl") pod "5cce4ccd-6a44-4c00-b7f5-9c74946eb308" (UID: "5cce4ccd-6a44-4c00-b7f5-9c74946eb308"). InnerVolumeSpecName "kube-api-access-lcvwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.787773 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cce4ccd-6a44-4c00-b7f5-9c74946eb308" (UID: "5cce4ccd-6a44-4c00-b7f5-9c74946eb308"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.790363 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c00f5d00-73a8-4268-acc9-49f809cf6d7f" (UID: "c00f5d00-73a8-4268-acc9-49f809cf6d7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.793941 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-config" (OuterVolumeSpecName: "config") pod "c00f5d00-73a8-4268-acc9-49f809cf6d7f" (UID: "c00f5d00-73a8-4268-acc9-49f809cf6d7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.806240 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-config-data" (OuterVolumeSpecName: "config-data") pod "5cce4ccd-6a44-4c00-b7f5-9c74946eb308" (UID: "5cce4ccd-6a44-4c00-b7f5-9c74946eb308"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.863764 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnhkg\" (UniqueName: \"kubernetes.io/projected/c00f5d00-73a8-4268-acc9-49f809cf6d7f-kube-api-access-pnhkg\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.864276 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.864293 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.864305 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcvwl\" (UniqueName: \"kubernetes.io/projected/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-kube-api-access-lcvwl\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.864316 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce4ccd-6a44-4c00-b7f5-9c74946eb308-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.864328 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.864338 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c00f5d00-73a8-4268-acc9-49f809cf6d7f-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.939851 4921 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-nw2kt"] Mar 18 12:32:07 crc kubenswrapper[4921]: I0318 12:32:07.947994 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-nw2kt"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.095774 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56b6658ccd-lzk2m"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.122550 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerStarted","Data":"e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c"} Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.125005 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ghx6h" event={"ID":"b7908371-b4b8-4437-be4a-13b8fccb6a9f","Type":"ContainerDied","Data":"4c6e16e7573daea38fba8c9b521c63b389bec9e5a2a7652504ef673f321c378b"} Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.125044 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c6e16e7573daea38fba8c9b521c63b389bec9e5a2a7652504ef673f321c378b" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.125066 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ghx6h" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.139665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p4hhn" event={"ID":"5cce4ccd-6a44-4c00-b7f5-9c74946eb308","Type":"ContainerDied","Data":"23a2fac68bcb55d5c66fa45f1af5767b5e3202edfddef79c117ac93631a60edd"} Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.139723 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a2fac68bcb55d5c66fa45f1af5767b5e3202edfddef79c117ac93631a60edd" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.140007 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-p4hhn" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.149357 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56b6658ccd-lzk2m" event={"ID":"5e6d9230-4481-43b3-891b-066a3bc6a46f","Type":"ContainerStarted","Data":"48f7f18f2360134f5a41a86ebfb15d62951af8a158fe8a854d0c63aba32b959c"} Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.160309 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7wrc2" event={"ID":"c00f5d00-73a8-4268-acc9-49f809cf6d7f","Type":"ContainerDied","Data":"c0b666e908f82c701c828154dfe5b274321888a58cc70a1a2fb13154f7450ab9"} Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.160366 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b666e908f82c701c828154dfe5b274321888a58cc70a1a2fb13154f7450ab9" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.160442 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7wrc2" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.161610 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.161657 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.396412 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-zbt8l"] Mar 18 12:32:08 crc kubenswrapper[4921]: E0318 12:32:08.398756 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00f5d00-73a8-4268-acc9-49f809cf6d7f" containerName="neutron-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.398787 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00f5d00-73a8-4268-acc9-49f809cf6d7f" containerName="neutron-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: E0318 12:32:08.398817 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cce4ccd-6a44-4c00-b7f5-9c74946eb308" containerName="placement-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.398829 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cce4ccd-6a44-4c00-b7f5-9c74946eb308" containerName="placement-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: E0318 12:32:08.398846 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7908371-b4b8-4437-be4a-13b8fccb6a9f" containerName="barbican-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.398855 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7908371-b4b8-4437-be4a-13b8fccb6a9f" containerName="barbican-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.399143 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7908371-b4b8-4437-be4a-13b8fccb6a9f" containerName="barbican-db-sync" Mar 18 12:32:08 
crc kubenswrapper[4921]: I0318 12:32:08.399284 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c00f5d00-73a8-4268-acc9-49f809cf6d7f" containerName="neutron-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.399302 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cce4ccd-6a44-4c00-b7f5-9c74946eb308" containerName="placement-db-sync" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.400507 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.410497 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-zbt8l"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.476591 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6cc469779b-2mfpc"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.477159 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.477249 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-config\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.477320 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7f7h\" (UniqueName: \"kubernetes.io/projected/ba60f355-2584-4d37-ad84-8469a25be177-kube-api-access-f7f7h\") 
pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.477343 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.477369 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-svc\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.477392 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.483383 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.488356 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mjwkw" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.488665 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.488891 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.489004 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.490014 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.504853 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cc469779b-2mfpc"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.571705 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58479fdccb-s9k88"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.573405 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.580803 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-public-tls-certs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.580872 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-combined-ca-bundle\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.580926 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.580978 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-internal-tls-certs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.581213 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.581318 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"neutron-neutron-dockercfg-6xm7m" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.581482 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.581585 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmknc\" (UniqueName: \"kubernetes.io/projected/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-kube-api-access-hmknc\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584317 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-config\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584407 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f7h\" (UniqueName: \"kubernetes.io/projected/ba60f355-2584-4d37-ad84-8469a25be177-kube-api-access-f7f7h\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584431 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: 
I0318 12:32:08.584458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-config-data\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584476 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-logs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584500 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-svc\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584537 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.584595 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-scripts\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.585489 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.585604 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.586283 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-config\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.586406 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-svc\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.586712 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.597309 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58479fdccb-s9k88"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 
12:32:08.631262 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7f7h\" (UniqueName: \"kubernetes.io/projected/ba60f355-2584-4d37-ad84-8469a25be177-kube-api-access-f7f7h\") pod \"dnsmasq-dns-55f844cf75-zbt8l\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689738 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-config\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689789 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-combined-ca-bundle\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689835 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-public-tls-certs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689854 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-combined-ca-bundle\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689891 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-internal-tls-certs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689918 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmknc\" (UniqueName: \"kubernetes.io/projected/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-kube-api-access-hmknc\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689937 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszsn\" (UniqueName: \"kubernetes.io/projected/6d65e295-0914-4c05-bd33-fa99e512893e-kube-api-access-sszsn\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.689975 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-httpd-config\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.690000 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-ovndb-tls-certs\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.690027 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-config-data\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.690044 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-logs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.690084 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-scripts\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.693918 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-scripts\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.700724 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-public-tls-certs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.702549 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-logs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.702951 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-config-data\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.711074 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-internal-tls-certs\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.711619 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-combined-ca-bundle\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.718880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmknc\" (UniqueName: \"kubernetes.io/projected/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-kube-api-access-hmknc\") pod \"placement-6cc469779b-2mfpc\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.761814 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.799993 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-config\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.800045 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-combined-ca-bundle\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.800133 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sszsn\" (UniqueName: \"kubernetes.io/projected/6d65e295-0914-4c05-bd33-fa99e512893e-kube-api-access-sszsn\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.800172 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-httpd-config\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.800200 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-ovndb-tls-certs\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc 
kubenswrapper[4921]: I0318 12:32:08.813568 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.815736 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-httpd-config\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.827736 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-combined-ca-bundle\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.836088 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-ovndb-tls-certs\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.853037 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-689c8956b9-wzd7n"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.874563 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszsn\" (UniqueName: \"kubernetes.io/projected/6d65e295-0914-4c05-bd33-fa99e512893e-kube-api-access-sszsn\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.875276 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.895485 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.896230 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nctw2" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.896936 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.897752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-config\") pod \"neutron-58479fdccb-s9k88\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.899349 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.907551 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-689c8956b9-wzd7n"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.971188 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c47f5f4db-j6swx"] Mar 18 12:32:08 crc kubenswrapper[4921]: I0318 12:32:08.972752 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.020378 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.035049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.035143 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ffm\" (UniqueName: \"kubernetes.io/projected/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-kube-api-access-j9ffm\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.035195 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-logs\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.035384 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data-custom\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.035426 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-combined-ca-bundle\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.063821 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c47f5f4db-j6swx"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.138483 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.138835 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ffm\" (UniqueName: \"kubernetes.io/projected/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-kube-api-access-j9ffm\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.138874 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-logs\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.138953 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data\") pod 
\"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.138985 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data-custom\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.143678 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-logs\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.145506 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-combined-ca-bundle\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.145619 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data-custom\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.145709 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-combined-ca-bundle\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.145768 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18566d04-485b-411a-a1b8-e761a7fa6933-logs\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.145799 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ph46\" (UniqueName: \"kubernetes.io/projected/18566d04-485b-411a-a1b8-e761a7fa6933-kube-api-access-4ph46\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.170164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data-custom\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.174517 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-combined-ca-bundle\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.182352 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j9ffm\" (UniqueName: \"kubernetes.io/projected/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-kube-api-access-j9ffm\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.188431 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-zbt8l"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.196769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data\") pod \"barbican-worker-689c8956b9-wzd7n\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.200197 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-bjlc4"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.202074 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.213060 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-bjlc4"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.247887 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-combined-ca-bundle\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.247942 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18566d04-485b-411a-a1b8-e761a7fa6933-logs\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.247963 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ph46\" (UniqueName: \"kubernetes.io/projected/18566d04-485b-411a-a1b8-e761a7fa6933-kube-api-access-4ph46\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.248049 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.248093 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data-custom\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.254633 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data-custom\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.255277 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-combined-ca-bundle\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.255535 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18566d04-485b-411a-a1b8-e761a7fa6933-logs\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.259633 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 
12:32:09.274395 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ph46\" (UniqueName: \"kubernetes.io/projected/18566d04-485b-411a-a1b8-e761a7fa6933-kube-api-access-4ph46\") pod \"barbican-keystone-listener-c47f5f4db-j6swx\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.304719 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4029d08d-77da-4e73-a89d-134ed55ea00b" path="/var/lib/kubelet/pods/4029d08d-77da-4e73-a89d-134ed55ea00b/volumes" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.305500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56b6658ccd-lzk2m" event={"ID":"5e6d9230-4481-43b3-891b-066a3bc6a46f","Type":"ContainerStarted","Data":"9cae82dc6d9adf77b41d7fa82acd4e2e90f6f5263c54efe2afe1b22b3ad1b371"} Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.305540 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.305553 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79d84c7864-6hqj5"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.307815 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d84c7864-6hqj5"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.307920 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.311681 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.317558 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56b6658ccd-lzk2m" podStartSLOduration=2.317518516 podStartE2EDuration="2.317518516s" podCreationTimestamp="2026-03-18 12:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:09.27333433 +0000 UTC m=+1348.823254959" watchObservedRunningTime="2026-03-18 12:32:09.317518516 +0000 UTC m=+1348.867439155" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.357516 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc48v\" (UniqueName: \"kubernetes.io/projected/f5295afc-4530-4c71-9671-19f54c2d73dc-kube-api-access-tc48v\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.357627 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-config\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.357733 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " 
pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.357798 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.358212 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.358312 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.384992 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.393723 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462200 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc48v\" (UniqueName: \"kubernetes.io/projected/f5295afc-4530-4c71-9671-19f54c2d73dc-kube-api-access-tc48v\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462677 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-config\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462721 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462748 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-combined-ca-bundle\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462771 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: 
\"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462872 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7t6g\" (UniqueName: \"kubernetes.io/projected/5af6c4d9-c843-47d6-b95e-a5e537231cc3-kube-api-access-m7t6g\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462943 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.462974 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af6c4d9-c843-47d6-b95e-a5e537231cc3-logs\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.463000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: 
\"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.463060 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data-custom\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.463714 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.463715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-config\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.463889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.464277 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " 
pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.464626 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.480031 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc48v\" (UniqueName: \"kubernetes.io/projected/f5295afc-4530-4c71-9671-19f54c2d73dc-kube-api-access-tc48v\") pod \"dnsmasq-dns-85ff748b95-bjlc4\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") " pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.543932 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.568318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.568385 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7t6g\" (UniqueName: \"kubernetes.io/projected/5af6c4d9-c843-47d6-b95e-a5e537231cc3-kube-api-access-m7t6g\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.568442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5af6c4d9-c843-47d6-b95e-a5e537231cc3-logs\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.568502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data-custom\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.568935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-combined-ca-bundle\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.574666 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af6c4d9-c843-47d6-b95e-a5e537231cc3-logs\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.588435 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data-custom\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.591777 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-combined-ca-bundle\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.620767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7t6g\" (UniqueName: \"kubernetes.io/projected/5af6c4d9-c843-47d6-b95e-a5e537231cc3-kube-api-access-m7t6g\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.647890 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data\") pod \"barbican-api-79d84c7864-6hqj5\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.730216 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-zbt8l"] Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.934886 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:09 crc kubenswrapper[4921]: I0318 12:32:09.989468 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cc469779b-2mfpc"] Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.091739 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58479fdccb-s9k88"] Mar 18 12:32:10 crc kubenswrapper[4921]: W0318 12:32:10.163259 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d65e295_0914_4c05_bd33_fa99e512893e.slice/crio-052423c1801110e93c59e4be562d37675607074a3c291981273b371af276de2d WatchSource:0}: Error finding container 052423c1801110e93c59e4be562d37675607074a3c291981273b371af276de2d: Status 404 returned error can't find the container with id 052423c1801110e93c59e4be562d37675607074a3c291981273b371af276de2d Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.274478 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c47f5f4db-j6swx"] Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.277390 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58479fdccb-s9k88" event={"ID":"6d65e295-0914-4c05-bd33-fa99e512893e","Type":"ContainerStarted","Data":"052423c1801110e93c59e4be562d37675607074a3c291981273b371af276de2d"} Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.327903 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-689c8956b9-wzd7n"] Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.388399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc469779b-2mfpc" event={"ID":"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71","Type":"ContainerStarted","Data":"7193354b788eb0607bbcde9142f899d1d30bcc2a2eaea982b5243974db236330"} Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.403292 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" event={"ID":"ba60f355-2584-4d37-ad84-8469a25be177","Type":"ContainerStarted","Data":"19e24efb8bbf70d0db10f6fb2c8a130c4e71d9363256099a920a568bb822a5b4"} Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.403369 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.403381 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.483319 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.483599 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.494383 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-bjlc4"] Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.528981 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.581704 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:10 crc kubenswrapper[4921]: I0318 12:32:10.935614 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d84c7864-6hqj5"] Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.418645 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58479fdccb-s9k88" event={"ID":"6d65e295-0914-4c05-bd33-fa99e512893e","Type":"ContainerStarted","Data":"6eb2bfc7a95745220c84b47f5d1b7fa05f6b59988bf8d3cfb91f23e2525cf1f5"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.422091 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-6cc469779b-2mfpc" event={"ID":"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71","Type":"ContainerStarted","Data":"21996dae0b2690b430cf03b884c33f3ef56bfe6e6623a7ddd63437c6d50e1ff5"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.422148 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc469779b-2mfpc" event={"ID":"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71","Type":"ContainerStarted","Data":"ec64628734d6d0cbd273786bbce180980c769e1b8b9804c0a9567e94a61a1793"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.424672 4921 generic.go:334] "Generic (PLEG): container finished" podID="ba60f355-2584-4d37-ad84-8469a25be177" containerID="5207f0ff3eae09ea65c589c07f3edb75bd0c0fc7d0ecc937aa1f968305f84225" exitCode=0 Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.424716 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" event={"ID":"ba60f355-2584-4d37-ad84-8469a25be177","Type":"ContainerDied","Data":"5207f0ff3eae09ea65c589c07f3edb75bd0c0fc7d0ecc937aa1f968305f84225"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.448496 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" event={"ID":"18566d04-485b-411a-a1b8-e761a7fa6933","Type":"ContainerStarted","Data":"0e13cacfeb90b9a196839bf99eb26021ed4bdb56e3f74be9a8465811c11456ef"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.468265 4921 generic.go:334] "Generic (PLEG): container finished" podID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerID="971947caeb1e521b0ff741bb7efd35261a154dc64dcdc7ac5547eec8e970acd4" exitCode=0 Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.468346 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" event={"ID":"f5295afc-4530-4c71-9671-19f54c2d73dc","Type":"ContainerDied","Data":"971947caeb1e521b0ff741bb7efd35261a154dc64dcdc7ac5547eec8e970acd4"} Mar 18 12:32:11 crc 
kubenswrapper[4921]: I0318 12:32:11.468371 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" event={"ID":"f5295afc-4530-4c71-9671-19f54c2d73dc","Type":"ContainerStarted","Data":"04e2505a5b93c207b8a1c7b96fc2937e24033c403229d184f6551d08722f72b3"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.476787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d84c7864-6hqj5" event={"ID":"5af6c4d9-c843-47d6-b95e-a5e537231cc3","Type":"ContainerStarted","Data":"1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.476824 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d84c7864-6hqj5" event={"ID":"5af6c4d9-c843-47d6-b95e-a5e537231cc3","Type":"ContainerStarted","Data":"c7b3245d2ecc6ae23681be849412d66a1316ac34e1e21f1339d4c59840e8d448"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.481223 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-689c8956b9-wzd7n" event={"ID":"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a","Type":"ContainerStarted","Data":"e1368e5fad63bd0bf40d1f93b49d2521faa2b23cb0717c5d6788756c91967068"} Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.481257 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.481268 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:11 crc kubenswrapper[4921]: I0318 12:32:11.980580 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.091672 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-sb\") pod \"ba60f355-2584-4d37-ad84-8469a25be177\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.092499 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-svc\") pod \"ba60f355-2584-4d37-ad84-8469a25be177\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.093185 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-swift-storage-0\") pod \"ba60f355-2584-4d37-ad84-8469a25be177\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.093464 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-nb\") pod \"ba60f355-2584-4d37-ad84-8469a25be177\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.093501 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-config\") pod \"ba60f355-2584-4d37-ad84-8469a25be177\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.093605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7f7h\" 
(UniqueName: \"kubernetes.io/projected/ba60f355-2584-4d37-ad84-8469a25be177-kube-api-access-f7f7h\") pod \"ba60f355-2584-4d37-ad84-8469a25be177\" (UID: \"ba60f355-2584-4d37-ad84-8469a25be177\") " Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.135470 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba60f355-2584-4d37-ad84-8469a25be177-kube-api-access-f7f7h" (OuterVolumeSpecName: "kube-api-access-f7f7h") pod "ba60f355-2584-4d37-ad84-8469a25be177" (UID: "ba60f355-2584-4d37-ad84-8469a25be177"). InnerVolumeSpecName "kube-api-access-f7f7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.149029 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-config" (OuterVolumeSpecName: "config") pod "ba60f355-2584-4d37-ad84-8469a25be177" (UID: "ba60f355-2584-4d37-ad84-8469a25be177"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.157900 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba60f355-2584-4d37-ad84-8469a25be177" (UID: "ba60f355-2584-4d37-ad84-8469a25be177"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.177171 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba60f355-2584-4d37-ad84-8469a25be177" (UID: "ba60f355-2584-4d37-ad84-8469a25be177"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.197840 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.197869 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.197880 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.197890 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7f7h\" (UniqueName: \"kubernetes.io/projected/ba60f355-2584-4d37-ad84-8469a25be177-kube-api-access-f7f7h\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.205236 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba60f355-2584-4d37-ad84-8469a25be177" (UID: "ba60f355-2584-4d37-ad84-8469a25be177"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.231217 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba60f355-2584-4d37-ad84-8469a25be177" (UID: "ba60f355-2584-4d37-ad84-8469a25be177"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.284135 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86b7dc884f-42l8h"] Mar 18 12:32:12 crc kubenswrapper[4921]: E0318 12:32:12.284634 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba60f355-2584-4d37-ad84-8469a25be177" containerName="init" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.284651 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba60f355-2584-4d37-ad84-8469a25be177" containerName="init" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.284919 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba60f355-2584-4d37-ad84-8469a25be177" containerName="init" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.286176 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.289456 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.290427 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.300907 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.300931 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba60f355-2584-4d37-ad84-8469a25be177-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.305102 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86b7dc884f-42l8h"] Mar 18 12:32:12 crc 
kubenswrapper[4921]: I0318 12:32:12.403628 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-config\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.403698 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgl6j\" (UniqueName: \"kubernetes.io/projected/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-kube-api-access-xgl6j\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.403742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-combined-ca-bundle\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.403777 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-public-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.403827 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-ovndb-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc 
kubenswrapper[4921]: I0318 12:32:12.403870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-internal-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.403968 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-httpd-config\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.495302 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" event={"ID":"f5295afc-4530-4c71-9671-19f54c2d73dc","Type":"ContainerStarted","Data":"246cea279be0e6bb5e4e28a1065d801c827ce21596d6021eb95e25ada400d0f7"} Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.495450 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.504925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d84c7864-6hqj5" event={"ID":"5af6c4d9-c843-47d6-b95e-a5e537231cc3","Type":"ContainerStarted","Data":"6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37"} Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.505272 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.505502 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 
12:32:12.505877 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-httpd-config\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.505939 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-config\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.505959 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgl6j\" (UniqueName: \"kubernetes.io/projected/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-kube-api-access-xgl6j\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.505989 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-combined-ca-bundle\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.506012 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-public-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.506042 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-ovndb-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.506066 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-internal-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.513584 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-config\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.513668 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-httpd-config\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.513690 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-internal-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.516782 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-combined-ca-bundle\") pod 
\"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.521426 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-public-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.526399 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-ovndb-tls-certs\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.528282 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58479fdccb-s9k88" event={"ID":"6d65e295-0914-4c05-bd33-fa99e512893e","Type":"ContainerStarted","Data":"05e8733e82208abf68bb03b99d7513198eaf58296f32a65d158dc2165f661c48"} Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.530982 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.542485 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" event={"ID":"ba60f355-2584-4d37-ad84-8469a25be177","Type":"ContainerDied","Data":"19e24efb8bbf70d0db10f6fb2c8a130c4e71d9363256099a920a568bb822a5b4"} Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.542578 4921 scope.go:117] "RemoveContainer" containerID="5207f0ff3eae09ea65c589c07f3edb75bd0c0fc7d0ecc937aa1f968305f84225" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.545672 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-zbt8l" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.547152 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.547206 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.547504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgl6j\" (UniqueName: \"kubernetes.io/projected/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-kube-api-access-xgl6j\") pod \"neutron-86b7dc884f-42l8h\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.557408 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" podStartSLOduration=4.557380616 podStartE2EDuration="4.557380616s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:12.51673067 +0000 UTC m=+1352.066651319" watchObservedRunningTime="2026-03-18 12:32:12.557380616 +0000 UTC m=+1352.107301255" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.585487 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79d84c7864-6hqj5" podStartSLOduration=3.585453634 podStartE2EDuration="3.585453634s" podCreationTimestamp="2026-03-18 12:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:12.541871715 +0000 UTC m=+1352.091792354" watchObservedRunningTime="2026-03-18 12:32:12.585453634 +0000 UTC m=+1352.135374273" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 
12:32:12.656557 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58479fdccb-s9k88" podStartSLOduration=4.656503263 podStartE2EDuration="4.656503263s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:12.568696797 +0000 UTC m=+1352.118617446" watchObservedRunningTime="2026-03-18 12:32:12.656503263 +0000 UTC m=+1352.206423902" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.661472 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6cc469779b-2mfpc" podStartSLOduration=4.661459004 podStartE2EDuration="4.661459004s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:12.617658419 +0000 UTC m=+1352.167579088" watchObservedRunningTime="2026-03-18 12:32:12.661459004 +0000 UTC m=+1352.211379643" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.674889 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.739375 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-zbt8l"] Mar 18 12:32:12 crc kubenswrapper[4921]: I0318 12:32:12.760602 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-zbt8l"] Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.133371 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.133865 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.187698 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86b7dc884f-42l8h"] Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.229713 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba60f355-2584-4d37-ad84-8469a25be177" path="/var/lib/kubelet/pods/ba60f355-2584-4d37-ad84-8469a25be177/volumes" Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.233330 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.570047 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:32:13 crc kubenswrapper[4921]: I0318 12:32:13.570090 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.579098 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-689c8956b9-wzd7n" event={"ID":"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a","Type":"ContainerStarted","Data":"a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9"} Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.581297 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" event={"ID":"18566d04-485b-411a-a1b8-e761a7fa6933","Type":"ContainerStarted","Data":"d769e05c72dbc3bfb0249264660c0d34a67576be10d78c4d0f42227432990ac5"} Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.582932 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86b7dc884f-42l8h" event={"ID":"a688bd96-47e0-4ae4-8e94-3c44f964b9e0","Type":"ContainerStarted","Data":"93b745602f033ec1a6fdfa900798b921c5b73da266fffb39f1ceaa11e6e673d5"} Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.582955 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86b7dc884f-42l8h" event={"ID":"a688bd96-47e0-4ae4-8e94-3c44f964b9e0","Type":"ContainerStarted","Data":"9b0b18186784b204a5f8998384f5413e967eea52ad4d64ba6bc5dc709997ff2b"} Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.596931 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.597020 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:32:14 crc kubenswrapper[4921]: I0318 12:32:14.753880 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.374263 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-cf79cb9db-9pf9t"] Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.376185 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.378897 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.388668 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cf79cb9db-9pf9t"] Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.393694 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474508 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-public-tls-certs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474555 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474735 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-combined-ca-bundle\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474786 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4rsr\" (UniqueName: 
\"kubernetes.io/projected/08667791-7c42-46d1-a74b-436dfefa5db3-kube-api-access-k4rsr\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474809 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08667791-7c42-46d1-a74b-436dfefa5db3-logs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474825 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-internal-tls-certs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.474847 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data-custom\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576574 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-public-tls-certs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576630 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576672 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-combined-ca-bundle\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576689 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4rsr\" (UniqueName: \"kubernetes.io/projected/08667791-7c42-46d1-a74b-436dfefa5db3-kube-api-access-k4rsr\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576703 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08667791-7c42-46d1-a74b-436dfefa5db3-logs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576720 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-internal-tls-certs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.576737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data-custom\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.577244 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08667791-7c42-46d1-a74b-436dfefa5db3-logs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.583028 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.583184 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-combined-ca-bundle\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.593745 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" event={"ID":"18566d04-485b-411a-a1b8-e761a7fa6933","Type":"ContainerStarted","Data":"d3111e0bf89186d6ceed4c9cbb069267d0c8685ca3cf5793ac787eddf6cf6018"} Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.596379 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-public-tls-certs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " 
pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.598336 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-internal-tls-certs\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.598562 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data-custom\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.599564 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86b7dc884f-42l8h" event={"ID":"a688bd96-47e0-4ae4-8e94-3c44f964b9e0","Type":"ContainerStarted","Data":"39e6a8d2a389dad40dfed60f96860330eb4be0119c8e89e962a17c2926a11993"} Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.599706 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.599720 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4rsr\" (UniqueName: \"kubernetes.io/projected/08667791-7c42-46d1-a74b-436dfefa5db3-kube-api-access-k4rsr\") pod \"barbican-api-cf79cb9db-9pf9t\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") " pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.618612 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-689c8956b9-wzd7n" 
event={"ID":"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a","Type":"ContainerStarted","Data":"59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632"} Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.624078 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" podStartSLOduration=3.7417019639999998 podStartE2EDuration="7.624061825s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="2026-03-18 12:32:10.276574163 +0000 UTC m=+1349.826494802" lastFinishedPulling="2026-03-18 12:32:14.158934024 +0000 UTC m=+1353.708854663" observedRunningTime="2026-03-18 12:32:15.616019816 +0000 UTC m=+1355.165940455" watchObservedRunningTime="2026-03-18 12:32:15.624061825 +0000 UTC m=+1355.173982464" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.653152 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86b7dc884f-42l8h" podStartSLOduration=3.653131121 podStartE2EDuration="3.653131121s" podCreationTimestamp="2026-03-18 12:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:15.643488507 +0000 UTC m=+1355.193409146" watchObservedRunningTime="2026-03-18 12:32:15.653131121 +0000 UTC m=+1355.203051760" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.662647 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-689c8956b9-wzd7n" podStartSLOduration=3.9328970180000002 podStartE2EDuration="7.662628701s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="2026-03-18 12:32:10.429298374 +0000 UTC m=+1349.979219013" lastFinishedPulling="2026-03-18 12:32:14.159030057 +0000 UTC m=+1353.708950696" observedRunningTime="2026-03-18 12:32:15.659609925 +0000 UTC m=+1355.209530564" watchObservedRunningTime="2026-03-18 12:32:15.662628701 +0000 UTC 
m=+1355.212549330" Mar 18 12:32:15 crc kubenswrapper[4921]: I0318 12:32:15.697397 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:32:17 crc kubenswrapper[4921]: I0318 12:32:17.081682 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:32:17 crc kubenswrapper[4921]: I0318 12:32:17.082077 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:32:19 crc kubenswrapper[4921]: I0318 12:32:19.546009 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" Mar 18 12:32:19 crc kubenswrapper[4921]: I0318 12:32:19.609931 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-x44rz"] Mar 18 12:32:19 crc kubenswrapper[4921]: I0318 12:32:19.610229 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerName="dnsmasq-dns" containerID="cri-o://aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6" gracePeriod=10 Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.461172 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.574471 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv77c\" (UniqueName: \"kubernetes.io/projected/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-kube-api-access-zv77c\") pod \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.574547 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-sb\") pod \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.574674 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-svc\") pod \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.574742 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-swift-storage-0\") pod \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.574772 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-config\") pod \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.574808 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-nb\") pod \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\" (UID: \"cc6a9d7f-27e2-4c00-8077-6e1044ad20af\") " Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.587287 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-kube-api-access-zv77c" (OuterVolumeSpecName: "kube-api-access-zv77c") pod "cc6a9d7f-27e2-4c00-8077-6e1044ad20af" (UID: "cc6a9d7f-27e2-4c00-8077-6e1044ad20af"). InnerVolumeSpecName "kube-api-access-zv77c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.644361 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc6a9d7f-27e2-4c00-8077-6e1044ad20af" (UID: "cc6a9d7f-27e2-4c00-8077-6e1044ad20af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.648816 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc6a9d7f-27e2-4c00-8077-6e1044ad20af" (UID: "cc6a9d7f-27e2-4c00-8077-6e1044ad20af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.651685 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-config" (OuterVolumeSpecName: "config") pod "cc6a9d7f-27e2-4c00-8077-6e1044ad20af" (UID: "cc6a9d7f-27e2-4c00-8077-6e1044ad20af"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.652993 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc6a9d7f-27e2-4c00-8077-6e1044ad20af" (UID: "cc6a9d7f-27e2-4c00-8077-6e1044ad20af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.654483 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc6a9d7f-27e2-4c00-8077-6e1044ad20af" (UID: "cc6a9d7f-27e2-4c00-8077-6e1044ad20af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.679291 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.679320 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv77c\" (UniqueName: \"kubernetes.io/projected/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-kube-api-access-zv77c\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.679330 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.679340 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-svc\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.679350 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.679358 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6a9d7f-27e2-4c00-8077-6e1044ad20af-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.695132 4921 generic.go:334] "Generic (PLEG): container finished" podID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerID="aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6" exitCode=0 Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.695456 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.697175 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" event={"ID":"cc6a9d7f-27e2-4c00-8077-6e1044ad20af","Type":"ContainerDied","Data":"aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6"} Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.697261 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-x44rz" event={"ID":"cc6a9d7f-27e2-4c00-8077-6e1044ad20af","Type":"ContainerDied","Data":"fb962b1a42b76bc3e87afe578dcaaee6c19cd8250ca1338eebd0cec9ee4c259d"} Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.697288 4921 scope.go:117] "RemoveContainer" containerID="aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.709264 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerStarted","Data":"92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9"} Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.709678 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-central-agent" containerID="cri-o://5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0" gracePeriod=30 Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.709919 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.710222 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-notification-agent" containerID="cri-o://7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8" gracePeriod=30 Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.710192 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="proxy-httpd" containerID="cri-o://92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9" gracePeriod=30 Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.710259 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="sg-core" containerID="cri-o://e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c" gracePeriod=30 Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.730248 4921 scope.go:117] "RemoveContainer" containerID="a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e" Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.752140 4921 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-api-cf79cb9db-9pf9t"]
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.754929 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.617040038 podStartE2EDuration="46.754906998s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="2026-03-18 12:31:36.185639597 +0000 UTC m=+1315.735560236" lastFinishedPulling="2026-03-18 12:32:20.323506557 +0000 UTC m=+1359.873427196" observedRunningTime="2026-03-18 12:32:20.743685949 +0000 UTC m=+1360.293606588" watchObservedRunningTime="2026-03-18 12:32:20.754906998 +0000 UTC m=+1360.304827637"
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.804233 4921 scope.go:117] "RemoveContainer" containerID="aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6"
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.814298 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-x44rz"]
Mar 18 12:32:20 crc kubenswrapper[4921]: E0318 12:32:20.816167 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6\": container with ID starting with aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6 not found: ID does not exist" containerID="aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6"
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.816199 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6"} err="failed to get container status \"aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6\": rpc error: code = NotFound desc = could not find container \"aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6\": container with ID starting with aacd8452a03999743f43feb978c719952618fc69d92cdcb41c36910e888b69c6 not found: ID does not exist"
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.816221 4921 scope.go:117] "RemoveContainer" containerID="a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e"
Mar 18 12:32:20 crc kubenswrapper[4921]: E0318 12:32:20.817997 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e\": container with ID starting with a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e not found: ID does not exist" containerID="a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e"
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.818018 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e"} err="failed to get container status \"a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e\": rpc error: code = NotFound desc = could not find container \"a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e\": container with ID starting with a8d2c55a320cb0a18736ee380fb32fd37abcb3fca7a60a628eabe7ac61d73b5e not found: ID does not exist"
Mar 18 12:32:20 crc kubenswrapper[4921]: I0318 12:32:20.823145 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-x44rz"]
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.248620 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" path="/var/lib/kubelet/pods/cc6a9d7f-27e2-4c00-8077-6e1044ad20af/volumes"
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.692065 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d84c7864-6hqj5"
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.719666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf79cb9db-9pf9t" event={"ID":"08667791-7c42-46d1-a74b-436dfefa5db3","Type":"ContainerStarted","Data":"cfefeafefb675bac464d8262a5b628863032fde78483031396ecf5c2c726f1af"}
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.719768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf79cb9db-9pf9t" event={"ID":"08667791-7c42-46d1-a74b-436dfefa5db3","Type":"ContainerStarted","Data":"199e0bb74b18c94897363ce6c49342390238bfcfe306768c0912bdc03eeb27b4"}
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.720167 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf79cb9db-9pf9t" event={"ID":"08667791-7c42-46d1-a74b-436dfefa5db3","Type":"ContainerStarted","Data":"178f666737bdc953ad23eeb783dc216db29acaa8bdb524a93feb51fa7e6e8fcd"}
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.720188 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cf79cb9db-9pf9t"
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.720203 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cf79cb9db-9pf9t"
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.722654 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nszlz" event={"ID":"0022dc9f-31d2-440f-831a-ae0a03c22b63","Type":"ContainerStarted","Data":"512062659d515017ee947ace7cc917182bfc83631f3814efe78fc925e4023714"}
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.725291 4921 generic.go:334] "Generic (PLEG): container finished" podID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerID="e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c" exitCode=2
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.725324 4921 generic.go:334] "Generic (PLEG): container finished" podID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerID="5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0" exitCode=0
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.725345 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerDied","Data":"e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c"}
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.725366 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerDied","Data":"5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0"}
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.743684 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-cf79cb9db-9pf9t" podStartSLOduration=6.743659549 podStartE2EDuration="6.743659549s" podCreationTimestamp="2026-03-18 12:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:21.734445408 +0000 UTC m=+1361.284366047" watchObservedRunningTime="2026-03-18 12:32:21.743659549 +0000 UTC m=+1361.293580188"
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.768559 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nszlz" podStartSLOduration=3.38303325 podStartE2EDuration="47.768537637s" podCreationTimestamp="2026-03-18 12:31:34 +0000 UTC" firstStartedPulling="2026-03-18 12:31:35.942008893 +0000 UTC m=+1315.491929542" lastFinishedPulling="2026-03-18 12:32:20.32751329 +0000 UTC m=+1359.877433929" observedRunningTime="2026-03-18 12:32:21.761578689 +0000 UTC m=+1361.311499328" watchObservedRunningTime="2026-03-18 12:32:21.768537637 +0000 UTC m=+1361.318458276"
Mar 18 12:32:21 crc kubenswrapper[4921]: I0318 12:32:21.769904 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d84c7864-6hqj5"
Mar 18 12:32:22 crc kubenswrapper[4921]: I0318 12:32:22.744578 4921 generic.go:334] "Generic (PLEG): container finished" podID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerID="7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8" exitCode=0
Mar 18 12:32:22 crc kubenswrapper[4921]: I0318 12:32:22.744711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerDied","Data":"7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8"}
Mar 18 12:32:26 crc kubenswrapper[4921]: I0318 12:32:26.788662 4921 generic.go:334] "Generic (PLEG): container finished" podID="0022dc9f-31d2-440f-831a-ae0a03c22b63" containerID="512062659d515017ee947ace7cc917182bfc83631f3814efe78fc925e4023714" exitCode=0
Mar 18 12:32:26 crc kubenswrapper[4921]: I0318 12:32:26.788718 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nszlz" event={"ID":"0022dc9f-31d2-440f-831a-ae0a03c22b63","Type":"ContainerDied","Data":"512062659d515017ee947ace7cc917182bfc83631f3814efe78fc925e4023714"}
Mar 18 12:32:27 crc kubenswrapper[4921]: I0318 12:32:27.115627 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cf79cb9db-9pf9t"
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.142713 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nszlz"
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250212 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-scripts\") pod \"0022dc9f-31d2-440f-831a-ae0a03c22b63\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") "
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2lpz\" (UniqueName: \"kubernetes.io/projected/0022dc9f-31d2-440f-831a-ae0a03c22b63-kube-api-access-j2lpz\") pod \"0022dc9f-31d2-440f-831a-ae0a03c22b63\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") "
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250341 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-combined-ca-bundle\") pod \"0022dc9f-31d2-440f-831a-ae0a03c22b63\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") "
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-config-data\") pod \"0022dc9f-31d2-440f-831a-ae0a03c22b63\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") "
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250447 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0022dc9f-31d2-440f-831a-ae0a03c22b63-etc-machine-id\") pod \"0022dc9f-31d2-440f-831a-ae0a03c22b63\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") "
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250484 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-db-sync-config-data\") pod \"0022dc9f-31d2-440f-831a-ae0a03c22b63\" (UID: \"0022dc9f-31d2-440f-831a-ae0a03c22b63\") "
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.250580 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0022dc9f-31d2-440f-831a-ae0a03c22b63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0022dc9f-31d2-440f-831a-ae0a03c22b63" (UID: "0022dc9f-31d2-440f-831a-ae0a03c22b63"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.252657 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0022dc9f-31d2-440f-831a-ae0a03c22b63-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.258270 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0022dc9f-31d2-440f-831a-ae0a03c22b63" (UID: "0022dc9f-31d2-440f-831a-ae0a03c22b63"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.258311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-scripts" (OuterVolumeSpecName: "scripts") pod "0022dc9f-31d2-440f-831a-ae0a03c22b63" (UID: "0022dc9f-31d2-440f-831a-ae0a03c22b63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.258350 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0022dc9f-31d2-440f-831a-ae0a03c22b63-kube-api-access-j2lpz" (OuterVolumeSpecName: "kube-api-access-j2lpz") pod "0022dc9f-31d2-440f-831a-ae0a03c22b63" (UID: "0022dc9f-31d2-440f-831a-ae0a03c22b63"). InnerVolumeSpecName "kube-api-access-j2lpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.281013 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0022dc9f-31d2-440f-831a-ae0a03c22b63" (UID: "0022dc9f-31d2-440f-831a-ae0a03c22b63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.315346 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-config-data" (OuterVolumeSpecName: "config-data") pod "0022dc9f-31d2-440f-831a-ae0a03c22b63" (UID: "0022dc9f-31d2-440f-831a-ae0a03c22b63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.354108 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.354168 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2lpz\" (UniqueName: \"kubernetes.io/projected/0022dc9f-31d2-440f-831a-ae0a03c22b63-kube-api-access-j2lpz\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.354183 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.354191 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.354200 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0022dc9f-31d2-440f-831a-ae0a03c22b63-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.582203 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cf79cb9db-9pf9t"
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.650712 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d84c7864-6hqj5"]
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.651066 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d84c7864-6hqj5" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api" containerID="cri-o://6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37" gracePeriod=30
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.651425 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d84c7864-6hqj5" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api-log" containerID="cri-o://1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca" gracePeriod=30
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.807240 4921 generic.go:334] "Generic (PLEG): container finished" podID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerID="1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca" exitCode=143
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.807457 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d84c7864-6hqj5" event={"ID":"5af6c4d9-c843-47d6-b95e-a5e537231cc3","Type":"ContainerDied","Data":"1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca"}
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.809205 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nszlz" event={"ID":"0022dc9f-31d2-440f-831a-ae0a03c22b63","Type":"ContainerDied","Data":"d0e540fa388afd5c99aafa99dcb5c81f899ee7a7dddc7eebc405ae426ccd917d"}
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.809243 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e540fa388afd5c99aafa99dcb5c81f899ee7a7dddc7eebc405ae426ccd917d"
Mar 18 12:32:28 crc kubenswrapper[4921]: I0318 12:32:28.809313 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nszlz"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.151202 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-phf9s"]
Mar 18 12:32:29 crc kubenswrapper[4921]: E0318 12:32:29.151833 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerName="dnsmasq-dns"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.151851 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerName="dnsmasq-dns"
Mar 18 12:32:29 crc kubenswrapper[4921]: E0318 12:32:29.151899 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerName="init"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.151908 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerName="init"
Mar 18 12:32:29 crc kubenswrapper[4921]: E0318 12:32:29.151940 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0022dc9f-31d2-440f-831a-ae0a03c22b63" containerName="cinder-db-sync"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.151949 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0022dc9f-31d2-440f-831a-ae0a03c22b63" containerName="cinder-db-sync"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.152199 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0022dc9f-31d2-440f-831a-ae0a03c22b63" containerName="cinder-db-sync"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.152224 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6a9d7f-27e2-4c00-8077-6e1044ad20af" containerName="dnsmasq-dns"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.153580 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.164931 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-phf9s"]
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.178350 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.180218 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.190016 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.190263 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.190284 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v6kc7"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.190394 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.229981 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.277900 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngpq\" (UniqueName: \"kubernetes.io/projected/b6ba9a5c-c719-470a-b047-437082f292d6-kube-api-access-gngpq\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e09fc439-605a-4569-9481-7240199ee081-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nn2\" (UniqueName: \"kubernetes.io/projected/e09fc439-605a-4569-9481-7240199ee081-kube-api-access-d7nn2\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278103 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278150 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-config\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278239 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278286 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278360 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278409 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.278456 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-scripts\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.362781 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.364651 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.371460 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.374169 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.379901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.379964 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-scripts\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gngpq\" (UniqueName: \"kubernetes.io/projected/b6ba9a5c-c719-470a-b047-437082f292d6-kube-api-access-gngpq\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380095 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e09fc439-605a-4569-9481-7240199ee081-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380140 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nn2\" (UniqueName: \"kubernetes.io/projected/e09fc439-605a-4569-9481-7240199ee081-kube-api-access-d7nn2\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380179 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380206 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-config\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380245 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380280 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380309 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380330 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.380386 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.388618 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.388703 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e09fc439-605a-4569-9481-7240199ee081-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.389537 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.390150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.394919 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-config\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.396244 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.403760 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.416396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-scripts\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.417058 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.417506 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.429122 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nn2\" (UniqueName: \"kubernetes.io/projected/e09fc439-605a-4569-9481-7240199ee081-kube-api-access-d7nn2\") pod \"cinder-scheduler-0\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.436209 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngpq\" (UniqueName: \"kubernetes.io/projected/b6ba9a5c-c719-470a-b047-437082f292d6-kube-api-access-gngpq\") pod \"dnsmasq-dns-5c9776ccc5-phf9s\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.477172 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482142 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data-custom\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482209 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482272 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a67a2dd-13a0-4406-b202-f6d7f545d554-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482326 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstc8\" (UniqueName: \"kubernetes.io/projected/4a67a2dd-13a0-4406-b202-f6d7f545d554-kube-api-access-xstc8\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482366 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a67a2dd-13a0-4406-b202-f6d7f545d554-logs\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-scripts\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.482548 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.503617 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.584644 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-scripts\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585105 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585173 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data-custom\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0"
Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585205 4921
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585261 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a67a2dd-13a0-4406-b202-f6d7f545d554-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585326 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xstc8\" (UniqueName: \"kubernetes.io/projected/4a67a2dd-13a0-4406-b202-f6d7f545d554-kube-api-access-xstc8\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585364 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a67a2dd-13a0-4406-b202-f6d7f545d554-logs\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585812 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a67a2dd-13a0-4406-b202-f6d7f545d554-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.585864 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a67a2dd-13a0-4406-b202-f6d7f545d554-logs\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " 
pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.592356 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data-custom\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.593317 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-scripts\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.594675 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.595385 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.609288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xstc8\" (UniqueName: \"kubernetes.io/projected/4a67a2dd-13a0-4406-b202-f6d7f545d554-kube-api-access-xstc8\") pod \"cinder-api-0\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " pod="openstack/cinder-api-0" Mar 18 12:32:29 crc kubenswrapper[4921]: I0318 12:32:29.777266 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.098554 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.197766 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-phf9s"] Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.372887 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:30 crc kubenswrapper[4921]: W0318 12:32:30.380237 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a67a2dd_13a0_4406_b202_f6d7f545d554.slice/crio-7f7b55898939cb265b36c3ca75783492d26baa72b4d2bb3dfb9b7ca10696d927 WatchSource:0}: Error finding container 7f7b55898939cb265b36c3ca75783492d26baa72b4d2bb3dfb9b7ca10696d927: Status 404 returned error can't find the container with id 7f7b55898939cb265b36c3ca75783492d26baa72b4d2bb3dfb9b7ca10696d927 Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.836274 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a67a2dd-13a0-4406-b202-f6d7f545d554","Type":"ContainerStarted","Data":"7f7b55898939cb265b36c3ca75783492d26baa72b4d2bb3dfb9b7ca10696d927"} Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.841053 4921 generic.go:334] "Generic (PLEG): container finished" podID="b6ba9a5c-c719-470a-b047-437082f292d6" containerID="56cfbcdbf457274044cc596b4a976b5f0345edae0248288903ae6d4e5cdb0409" exitCode=0 Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.841134 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" event={"ID":"b6ba9a5c-c719-470a-b047-437082f292d6","Type":"ContainerDied","Data":"56cfbcdbf457274044cc596b4a976b5f0345edae0248288903ae6d4e5cdb0409"} Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.841162 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" event={"ID":"b6ba9a5c-c719-470a-b047-437082f292d6","Type":"ContainerStarted","Data":"9954f490475fc861bd4f6ef8c65791bb210ef59a5589c5fc93f477b64832ea51"} Mar 18 12:32:30 crc kubenswrapper[4921]: I0318 12:32:30.850314 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e09fc439-605a-4569-9481-7240199ee081","Type":"ContainerStarted","Data":"3ede83b66903c9605225025c3ee415b9d25208a5f453a03d3f71ccd11d672526"} Mar 18 12:32:31 crc kubenswrapper[4921]: I0318 12:32:31.866407 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a67a2dd-13a0-4406-b202-f6d7f545d554","Type":"ContainerStarted","Data":"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73"} Mar 18 12:32:31 crc kubenswrapper[4921]: I0318 12:32:31.873677 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" event={"ID":"b6ba9a5c-c719-470a-b047-437082f292d6","Type":"ContainerStarted","Data":"c07da51ea5d42b20509a8bfcec21fdd4d030ff5c491ccc9255b87fb76712f053"} Mar 18 12:32:31 crc kubenswrapper[4921]: I0318 12:32:31.873861 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" Mar 18 12:32:31 crc kubenswrapper[4921]: I0318 12:32:31.899672 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" podStartSLOduration=2.89965583 podStartE2EDuration="2.89965583s" podCreationTimestamp="2026-03-18 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:31.892300881 +0000 UTC m=+1371.442221520" watchObservedRunningTime="2026-03-18 12:32:31.89965583 +0000 UTC m=+1371.449576469" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.229101 4921 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/barbican-api-79d84c7864-6hqj5" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:44132->10.217.0.160:9311: read: connection reset by peer" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.229134 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d84c7864-6hqj5" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:44134->10.217.0.160:9311: read: connection reset by peer" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.335955 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.857468 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.944758 4921 generic.go:334] "Generic (PLEG): container finished" podID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerID="6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37" exitCode=0 Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.944822 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d84c7864-6hqj5" event={"ID":"5af6c4d9-c843-47d6-b95e-a5e537231cc3","Type":"ContainerDied","Data":"6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37"} Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.944856 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d84c7864-6hqj5" event={"ID":"5af6c4d9-c843-47d6-b95e-a5e537231cc3","Type":"ContainerDied","Data":"c7b3245d2ecc6ae23681be849412d66a1316ac34e1e21f1339d4c59840e8d448"} Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.944877 4921 
scope.go:117] "RemoveContainer" containerID="6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.945027 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d84c7864-6hqj5" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.949244 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e09fc439-605a-4569-9481-7240199ee081","Type":"ContainerStarted","Data":"1eb6ebfba3ce6708e56a505a8edaacebf1f756a0a161dc1bc9295ed1d6b2be9a"} Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.955890 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a67a2dd-13a0-4406-b202-f6d7f545d554","Type":"ContainerStarted","Data":"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b"} Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.956136 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.961595 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data-custom\") pod \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.961694 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7t6g\" (UniqueName: \"kubernetes.io/projected/5af6c4d9-c843-47d6-b95e-a5e537231cc3-kube-api-access-m7t6g\") pod \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.961947 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data\") pod \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.962022 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af6c4d9-c843-47d6-b95e-a5e537231cc3-logs\") pod \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.962155 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-combined-ca-bundle\") pod \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\" (UID: \"5af6c4d9-c843-47d6-b95e-a5e537231cc3\") " Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.964228 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5af6c4d9-c843-47d6-b95e-a5e537231cc3-logs" (OuterVolumeSpecName: "logs") pod "5af6c4d9-c843-47d6-b95e-a5e537231cc3" (UID: "5af6c4d9-c843-47d6-b95e-a5e537231cc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.966301 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af6c4d9-c843-47d6-b95e-a5e537231cc3-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.967259 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5af6c4d9-c843-47d6-b95e-a5e537231cc3" (UID: "5af6c4d9-c843-47d6-b95e-a5e537231cc3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.976641 4921 scope.go:117] "RemoveContainer" containerID="1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.980125 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.980067497 podStartE2EDuration="3.980067497s" podCreationTimestamp="2026-03-18 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:32.976452255 +0000 UTC m=+1372.526372894" watchObservedRunningTime="2026-03-18 12:32:32.980067497 +0000 UTC m=+1372.529988136" Mar 18 12:32:32 crc kubenswrapper[4921]: I0318 12:32:32.982545 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af6c4d9-c843-47d6-b95e-a5e537231cc3-kube-api-access-m7t6g" (OuterVolumeSpecName: "kube-api-access-m7t6g") pod "5af6c4d9-c843-47d6-b95e-a5e537231cc3" (UID: "5af6c4d9-c843-47d6-b95e-a5e537231cc3"). InnerVolumeSpecName "kube-api-access-m7t6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.034204 4921 scope.go:117] "RemoveContainer" containerID="6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37" Mar 18 12:32:33 crc kubenswrapper[4921]: E0318 12:32:33.034937 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37\": container with ID starting with 6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37 not found: ID does not exist" containerID="6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.035069 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37"} err="failed to get container status \"6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37\": rpc error: code = NotFound desc = could not find container \"6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37\": container with ID starting with 6ba9a7568c96e8c366cd6d269780a6a696835e2a15fdb92d91217e6f23d40a37 not found: ID does not exist" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.035276 4921 scope.go:117] "RemoveContainer" containerID="1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca" Mar 18 12:32:33 crc kubenswrapper[4921]: E0318 12:32:33.035676 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca\": container with ID starting with 1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca not found: ID does not exist" containerID="1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.036288 
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca"} err="failed to get container status \"1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca\": rpc error: code = NotFound desc = could not find container \"1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca\": container with ID starting with 1fead00c0f51ab02b5947a3352d86531aa0429b661fbc28a4a937b54e473cdca not found: ID does not exist" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.041741 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data" (OuterVolumeSpecName: "config-data") pod "5af6c4d9-c843-47d6-b95e-a5e537231cc3" (UID: "5af6c4d9-c843-47d6-b95e-a5e537231cc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.047610 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5af6c4d9-c843-47d6-b95e-a5e537231cc3" (UID: "5af6c4d9-c843-47d6-b95e-a5e537231cc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.069231 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.069263 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.069274 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5af6c4d9-c843-47d6-b95e-a5e537231cc3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.069289 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7t6g\" (UniqueName: \"kubernetes.io/projected/5af6c4d9-c843-47d6-b95e-a5e537231cc3-kube-api-access-m7t6g\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.282710 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d84c7864-6hqj5"] Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.296407 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79d84c7864-6hqj5"] Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.965657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e09fc439-605a-4569-9481-7240199ee081","Type":"ContainerStarted","Data":"919e281fcb6315271dc3b2ca3372489b45d43b9d25aac2beb2c85495cb3a505c"} Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.967610 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" 
containerName="cinder-api-log" containerID="cri-o://d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73" gracePeriod=30 Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.968138 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api" containerID="cri-o://121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b" gracePeriod=30 Mar 18 12:32:33 crc kubenswrapper[4921]: I0318 12:32:33.997405 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.46860562 podStartE2EDuration="4.997386243s" podCreationTimestamp="2026-03-18 12:32:29 +0000 UTC" firstStartedPulling="2026-03-18 12:32:30.196191429 +0000 UTC m=+1369.746112068" lastFinishedPulling="2026-03-18 12:32:31.724972052 +0000 UTC m=+1371.274892691" observedRunningTime="2026-03-18 12:32:33.984893968 +0000 UTC m=+1373.534814607" watchObservedRunningTime="2026-03-18 12:32:33.997386243 +0000 UTC m=+1373.547306882" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.504441 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.731456 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.797942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a67a2dd-13a0-4406-b202-f6d7f545d554-etc-machine-id\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798016 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xstc8\" (UniqueName: \"kubernetes.io/projected/4a67a2dd-13a0-4406-b202-f6d7f545d554-kube-api-access-xstc8\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798098 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a67a2dd-13a0-4406-b202-f6d7f545d554-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798097 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798188 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data-custom\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798227 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a67a2dd-13a0-4406-b202-f6d7f545d554-logs\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798319 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-combined-ca-bundle\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.798456 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-scripts\") pod \"4a67a2dd-13a0-4406-b202-f6d7f545d554\" (UID: \"4a67a2dd-13a0-4406-b202-f6d7f545d554\") " Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.799069 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4a67a2dd-13a0-4406-b202-f6d7f545d554-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.799073 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a67a2dd-13a0-4406-b202-f6d7f545d554-logs" (OuterVolumeSpecName: "logs") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.810689 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a67a2dd-13a0-4406-b202-f6d7f545d554-kube-api-access-xstc8" (OuterVolumeSpecName: "kube-api-access-xstc8") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "kube-api-access-xstc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.813634 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.816914 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-scripts" (OuterVolumeSpecName: "scripts") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.845034 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.866283 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data" (OuterVolumeSpecName: "config-data") pod "4a67a2dd-13a0-4406-b202-f6d7f545d554" (UID: "4a67a2dd-13a0-4406-b202-f6d7f545d554"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.901078 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.901129 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xstc8\" (UniqueName: \"kubernetes.io/projected/4a67a2dd-13a0-4406-b202-f6d7f545d554-kube-api-access-xstc8\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.901143 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.901152 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 
12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.901164 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a67a2dd-13a0-4406-b202-f6d7f545d554-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.901171 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a67a2dd-13a0-4406-b202-f6d7f545d554-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981534 4921 generic.go:334] "Generic (PLEG): container finished" podID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerID="121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b" exitCode=0 Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981571 4921 generic.go:334] "Generic (PLEG): container finished" podID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerID="d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73" exitCode=143 Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981600 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981622 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a67a2dd-13a0-4406-b202-f6d7f545d554","Type":"ContainerDied","Data":"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b"} Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981670 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a67a2dd-13a0-4406-b202-f6d7f545d554","Type":"ContainerDied","Data":"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73"} Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981680 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a67a2dd-13a0-4406-b202-f6d7f545d554","Type":"ContainerDied","Data":"7f7b55898939cb265b36c3ca75783492d26baa72b4d2bb3dfb9b7ca10696d927"} Mar 18 12:32:34 crc kubenswrapper[4921]: I0318 12:32:34.981696 4921 scope.go:117] "RemoveContainer" containerID="121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.008181 4921 scope.go:117] "RemoveContainer" containerID="d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.021494 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.032147 4921 scope.go:117] "RemoveContainer" containerID="121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b" Mar 18 12:32:35 crc kubenswrapper[4921]: E0318 12:32:35.032663 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b\": container with ID starting with 121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b not found: ID 
does not exist" containerID="121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.032705 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b"} err="failed to get container status \"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b\": rpc error: code = NotFound desc = could not find container \"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b\": container with ID starting with 121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b not found: ID does not exist" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.032732 4921 scope.go:117] "RemoveContainer" containerID="d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73" Mar 18 12:32:35 crc kubenswrapper[4921]: E0318 12:32:35.035152 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73\": container with ID starting with d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73 not found: ID does not exist" containerID="d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.035621 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73"} err="failed to get container status \"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73\": rpc error: code = NotFound desc = could not find container \"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73\": container with ID starting with d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73 not found: ID does not exist" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.035658 4921 
scope.go:117] "RemoveContainer" containerID="121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.036071 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b"} err="failed to get container status \"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b\": rpc error: code = NotFound desc = could not find container \"121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b\": container with ID starting with 121affa69576a38c0f9fb52f74b07ed71cf287766487072f9616d03362493c3b not found: ID does not exist" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.036088 4921 scope.go:117] "RemoveContainer" containerID="d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.038279 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73"} err="failed to get container status \"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73\": rpc error: code = NotFound desc = could not find container \"d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73\": container with ID starting with d150861a01c71cd88f58b2a80a0294b420d0445ac9fc72d63259bbcc7d4aed73 not found: ID does not exist" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.044570 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.055905 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:35 crc kubenswrapper[4921]: E0318 12:32:35.056770 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api" Mar 18 12:32:35 crc 
kubenswrapper[4921]: I0318 12:32:35.056800 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api" Mar 18 12:32:35 crc kubenswrapper[4921]: E0318 12:32:35.056840 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api-log" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.056850 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api-log" Mar 18 12:32:35 crc kubenswrapper[4921]: E0318 12:32:35.056891 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.056900 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api" Mar 18 12:32:35 crc kubenswrapper[4921]: E0318 12:32:35.056910 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api-log" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.056918 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api-log" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.057179 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api-log" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.057203 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.057219 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" containerName="cinder-api-log" Mar 18 12:32:35 crc kubenswrapper[4921]: 
I0318 12:32:35.057230 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" containerName="barbican-api" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.058828 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.066201 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.066838 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.067367 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.069787 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105481 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-scripts\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105592 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b574f65d-9f59-41e3-bec6-59c25cc847fe-logs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-internal-tls-certs\") pod \"cinder-api-0\" 
(UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105683 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data-custom\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105723 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b574f65d-9f59-41e3-bec6-59c25cc847fe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105766 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105806 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.105831 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7g94\" (UniqueName: \"kubernetes.io/projected/b574f65d-9f59-41e3-bec6-59c25cc847fe-kube-api-access-t7g94\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc 
kubenswrapper[4921]: I0318 12:32:35.105887 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.135642 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.208523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b574f65d-9f59-41e3-bec6-59c25cc847fe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.208619 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.208680 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.208719 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7g94\" (UniqueName: \"kubernetes.io/projected/b574f65d-9f59-41e3-bec6-59c25cc847fe-kube-api-access-t7g94\") pod 
\"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.208714 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b574f65d-9f59-41e3-bec6-59c25cc847fe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.208758 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.209167 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-scripts\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.209433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b574f65d-9f59-41e3-bec6-59c25cc847fe-logs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.209580 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.209782 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data-custom\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.210520 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b574f65d-9f59-41e3-bec6-59c25cc847fe-logs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.213521 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-scripts\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.214423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.214826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.215186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data-custom\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.215496 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.217688 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.226753 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a67a2dd-13a0-4406-b202-f6d7f545d554" path="/var/lib/kubelet/pods/4a67a2dd-13a0-4406-b202-f6d7f545d554/volumes" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.227880 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af6c4d9-c843-47d6-b95e-a5e537231cc3" path="/var/lib/kubelet/pods/5af6c4d9-c843-47d6-b95e-a5e537231cc3/volumes" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.227928 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7g94\" (UniqueName: \"kubernetes.io/projected/b574f65d-9f59-41e3-bec6-59c25cc847fe-kube-api-access-t7g94\") pod \"cinder-api-0\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.463739 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.953803 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:32:35 crc kubenswrapper[4921]: W0318 12:32:35.956340 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb574f65d_9f59_41e3_bec6_59c25cc847fe.slice/crio-784a1a51473fdd66b5391393d29a4e8b6dce82d46b9a9ae4fa0f022321ea4264 WatchSource:0}: Error finding container 784a1a51473fdd66b5391393d29a4e8b6dce82d46b9a9ae4fa0f022321ea4264: Status 404 returned error can't find the container with id 784a1a51473fdd66b5391393d29a4e8b6dce82d46b9a9ae4fa0f022321ea4264 Mar 18 12:32:35 crc kubenswrapper[4921]: I0318 12:32:35.994093 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b574f65d-9f59-41e3-bec6-59c25cc847fe","Type":"ContainerStarted","Data":"784a1a51473fdd66b5391393d29a4e8b6dce82d46b9a9ae4fa0f022321ea4264"} Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.084379 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vmlkw"] Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.087670 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.091339 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmlkw"] Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.148562 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxds\" (UniqueName: \"kubernetes.io/projected/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-kube-api-access-kdxds\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.148638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-utilities\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.148709 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-catalog-content\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.252738 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxds\" (UniqueName: \"kubernetes.io/projected/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-kube-api-access-kdxds\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.252824 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-utilities\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.252876 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-catalog-content\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.253360 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-utilities\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.254251 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-catalog-content\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.272010 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxds\" (UniqueName: \"kubernetes.io/projected/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-kube-api-access-kdxds\") pod \"community-operators-vmlkw\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.454002 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:36 crc kubenswrapper[4921]: I0318 12:32:36.945250 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmlkw"] Mar 18 12:32:37 crc kubenswrapper[4921]: I0318 12:32:37.016398 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b574f65d-9f59-41e3-bec6-59c25cc847fe","Type":"ContainerStarted","Data":"7309c213d8e4e3cb567c026e50b6cdc87298f82a2247b0633434b2e4b5e65f3c"} Mar 18 12:32:37 crc kubenswrapper[4921]: I0318 12:32:37.025912 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerStarted","Data":"643b1c5daa000753a2e0b4aaadfc77cd1e3cb3513cfdba679f1124e0a6149d4e"} Mar 18 12:32:38 crc kubenswrapper[4921]: I0318 12:32:38.040999 4921 generic.go:334] "Generic (PLEG): container finished" podID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerID="e9e7dac82d03b88207927832985decb08956e842ef2305729e31d3999509a41b" exitCode=0 Mar 18 12:32:38 crc kubenswrapper[4921]: I0318 12:32:38.041082 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerDied","Data":"e9e7dac82d03b88207927832985decb08956e842ef2305729e31d3999509a41b"} Mar 18 12:32:38 crc kubenswrapper[4921]: I0318 12:32:38.043876 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b574f65d-9f59-41e3-bec6-59c25cc847fe","Type":"ContainerStarted","Data":"d0425a7018f55d15fcb50cbcccfeff3feea433294ce78fd911a563de3145ed79"} Mar 18 12:32:38 crc kubenswrapper[4921]: I0318 12:32:38.044043 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 12:32:38 crc kubenswrapper[4921]: I0318 12:32:38.084003 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.083976409 podStartE2EDuration="3.083976409s" podCreationTimestamp="2026-03-18 12:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:38.083279709 +0000 UTC m=+1377.633200358" watchObservedRunningTime="2026-03-18 12:32:38.083976409 +0000 UTC m=+1377.633897038" Mar 18 12:32:38 crc kubenswrapper[4921]: I0318 12:32:38.909882 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.103084 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerStarted","Data":"4d5b46da32425f4b00cb922aa64ca6a8d7619f7670eb59d5b236eeef1b9cb4b6"} Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.479318 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.496557 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.571820 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-bjlc4"] Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.572131 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerName="dnsmasq-dns" containerID="cri-o://246cea279be0e6bb5e4e28a1065d801c827ce21596d6021eb95e25ada400d0f7" gracePeriod=10 Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.846741 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/cinder-scheduler-0" Mar 18 12:32:39 crc kubenswrapper[4921]: I0318 12:32:39.896238 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.118797 4921 generic.go:334] "Generic (PLEG): container finished" podID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerID="4d5b46da32425f4b00cb922aa64ca6a8d7619f7670eb59d5b236eeef1b9cb4b6" exitCode=0 Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.118944 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerDied","Data":"4d5b46da32425f4b00cb922aa64ca6a8d7619f7670eb59d5b236eeef1b9cb4b6"} Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.128381 4921 generic.go:334] "Generic (PLEG): container finished" podID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerID="246cea279be0e6bb5e4e28a1065d801c827ce21596d6021eb95e25ada400d0f7" exitCode=0 Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.128614 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="cinder-scheduler" containerID="cri-o://1eb6ebfba3ce6708e56a505a8edaacebf1f756a0a161dc1bc9295ed1d6b2be9a" gracePeriod=30 Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.128710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" event={"ID":"f5295afc-4530-4c71-9671-19f54c2d73dc","Type":"ContainerDied","Data":"246cea279be0e6bb5e4e28a1065d801c827ce21596d6021eb95e25ada400d0f7"} Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.128778 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="probe" 
containerID="cri-o://919e281fcb6315271dc3b2ca3372489b45d43b9d25aac2beb2c85495cb3a505c" gracePeriod=30
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.433251 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cc469779b-2mfpc"
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.625201 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cc469779b-2mfpc"
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.832018 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4"
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.987288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc48v\" (UniqueName: \"kubernetes.io/projected/f5295afc-4530-4c71-9671-19f54c2d73dc-kube-api-access-tc48v\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.987365 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-sb\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.987509 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-swift-storage-0\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.987569 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-config\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.987604 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-svc\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:40 crc kubenswrapper[4921]: I0318 12:32:40.988190 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.007736 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5295afc-4530-4c71-9671-19f54c2d73dc-kube-api-access-tc48v" (OuterVolumeSpecName: "kube-api-access-tc48v") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "kube-api-access-tc48v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.047703 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.055894 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-config" (OuterVolumeSpecName: "config") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.056413 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.087076 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.089745 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.090548 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb\") pod \"f5295afc-4530-4c71-9671-19f54c2d73dc\" (UID: \"f5295afc-4530-4c71-9671-19f54c2d73dc\") "
Mar 18 12:32:41 crc kubenswrapper[4921]: W0318 12:32:41.090698 4921 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f5295afc-4530-4c71-9671-19f54c2d73dc/volumes/kubernetes.io~configmap/ovsdbserver-nb
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.090726 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5295afc-4530-4c71-9671-19f54c2d73dc" (UID: "f5295afc-4530-4c71-9671-19f54c2d73dc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.091092 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.091195 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.091215 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.091229 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.091240 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5295afc-4530-4c71-9671-19f54c2d73dc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.091251 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc48v\" (UniqueName: \"kubernetes.io/projected/f5295afc-4530-4c71-9671-19f54c2d73dc-kube-api-access-tc48v\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.144480 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4" event={"ID":"f5295afc-4530-4c71-9671-19f54c2d73dc","Type":"ContainerDied","Data":"04e2505a5b93c207b8a1c7b96fc2937e24033c403229d184f6551d08722f72b3"}
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.144573 4921 scope.go:117] "RemoveContainer" containerID="246cea279be0e6bb5e4e28a1065d801c827ce21596d6021eb95e25ada400d0f7"
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.144760 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-bjlc4"
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.162556 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerStarted","Data":"64a7e5c6a784f2cbf9471ce705463e11a7b554250f3ac9765668e73809c9565d"}
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.191430 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vmlkw" podStartSLOduration=2.567611785 podStartE2EDuration="5.191410871s" podCreationTimestamp="2026-03-18 12:32:36 +0000 UTC" firstStartedPulling="2026-03-18 12:32:38.044498948 +0000 UTC m=+1377.594419587" lastFinishedPulling="2026-03-18 12:32:40.668298034 +0000 UTC m=+1380.218218673" observedRunningTime="2026-03-18 12:32:41.18995196 +0000 UTC m=+1380.739872599" watchObservedRunningTime="2026-03-18 12:32:41.191410871 +0000 UTC m=+1380.741331510"
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.198124 4921 scope.go:117] "RemoveContainer" containerID="971947caeb1e521b0ff741bb7efd35261a154dc64dcdc7ac5547eec8e970acd4"
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.232751 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-bjlc4"]
Mar 18 12:32:41 crc kubenswrapper[4921]: I0318 12:32:41.261576 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-bjlc4"]
Mar 18 12:32:42 crc kubenswrapper[4921]: I0318 12:32:42.170566 4921 generic.go:334] "Generic (PLEG): container finished" podID="e09fc439-605a-4569-9481-7240199ee081" containerID="919e281fcb6315271dc3b2ca3372489b45d43b9d25aac2beb2c85495cb3a505c" exitCode=0
Mar 18 12:32:42 crc kubenswrapper[4921]: I0318 12:32:42.170636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e09fc439-605a-4569-9481-7240199ee081","Type":"ContainerDied","Data":"919e281fcb6315271dc3b2ca3372489b45d43b9d25aac2beb2c85495cb3a505c"}
Mar 18 12:32:42 crc kubenswrapper[4921]: I0318 12:32:42.688818 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-86b7dc884f-42l8h"
Mar 18 12:32:42 crc kubenswrapper[4921]: I0318 12:32:42.758867 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58479fdccb-s9k88"]
Mar 18 12:32:42 crc kubenswrapper[4921]: I0318 12:32:42.759270 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58479fdccb-s9k88" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-api" containerID="cri-o://6eb2bfc7a95745220c84b47f5d1b7fa05f6b59988bf8d3cfb91f23e2525cf1f5" gracePeriod=30
Mar 18 12:32:42 crc kubenswrapper[4921]: I0318 12:32:42.759400 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58479fdccb-s9k88" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-httpd" containerID="cri-o://05e8733e82208abf68bb03b99d7513198eaf58296f32a65d158dc2165f661c48" gracePeriod=30
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.182042 4921 generic.go:334] "Generic (PLEG): container finished" podID="6d65e295-0914-4c05-bd33-fa99e512893e" containerID="05e8733e82208abf68bb03b99d7513198eaf58296f32a65d158dc2165f661c48" exitCode=0
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.182128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58479fdccb-s9k88" event={"ID":"6d65e295-0914-4c05-bd33-fa99e512893e","Type":"ContainerDied","Data":"05e8733e82208abf68bb03b99d7513198eaf58296f32a65d158dc2165f661c48"}
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.220168 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" path="/var/lib/kubelet/pods/f5295afc-4530-4c71-9671-19f54c2d73dc/volumes"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.222594 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:43 crc kubenswrapper[4921]: E0318 12:32:43.222978 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerName="init"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.223000 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerName="init"
Mar 18 12:32:43 crc kubenswrapper[4921]: E0318 12:32:43.223050 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerName="dnsmasq-dns"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.223059 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerName="dnsmasq-dns"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.223967 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5295afc-4530-4c71-9671-19f54c2d73dc" containerName="dnsmasq-dns"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.224774 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.227145 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.227390 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.227838 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7vlsz"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.232612 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.264689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.265018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.265273 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.265352 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzcmk\" (UniqueName: \"kubernetes.io/projected/5ad34d31-b27a-447b-8857-f0c34c2155b9-kube-api-access-lzcmk\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.367056 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.367421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzcmk\" (UniqueName: \"kubernetes.io/projected/5ad34d31-b27a-447b-8857-f0c34c2155b9-kube-api-access-lzcmk\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.367613 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.367745 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.368828 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.379860 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.379883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config-secret\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.390498 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzcmk\" (UniqueName: \"kubernetes.io/projected/5ad34d31-b27a-447b-8857-f0c34c2155b9-kube-api-access-lzcmk\") pod \"openstackclient\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.547344 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.696319 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.704471 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.747265 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.749311 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.765726 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.887080 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.887525 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.887548 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlpql\" (UniqueName: \"kubernetes.io/projected/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-kube-api-access-qlpql\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.887591 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.989063 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.989186 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.989207 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlpql\" (UniqueName: \"kubernetes.io/projected/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-kube-api-access-qlpql\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.989263 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:43 crc kubenswrapper[4921]: I0318 12:32:43.990159 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.001027 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.001092 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.014391 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlpql\" (UniqueName: \"kubernetes.io/projected/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-kube-api-access-qlpql\") pod \"openstackclient\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: E0318 12:32:44.088787 4921 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 18 12:32:44 crc kubenswrapper[4921]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_5ad34d31-b27a-447b-8857-f0c34c2155b9_0(fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532" Netns:"/var/run/netns/b0c92787-0195-491d-bc14-e1848d097911" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532;K8S_POD_UID=5ad34d31-b27a-447b-8857-f0c34c2155b9" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/5ad34d31-b27a-447b-8857-f0c34c2155b9:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532 network default NAD default] [openstack/openstackclient fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:a8 [10.217.0.168/23]
Mar 18 12:32:44 crc kubenswrapper[4921]: '
Mar 18 12:32:44 crc kubenswrapper[4921]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 18 12:32:44 crc kubenswrapper[4921]: >
Mar 18 12:32:44 crc kubenswrapper[4921]: E0318 12:32:44.088855 4921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 18 12:32:44 crc kubenswrapper[4921]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_5ad34d31-b27a-447b-8857-f0c34c2155b9_0(fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532" Netns:"/var/run/netns/b0c92787-0195-491d-bc14-e1848d097911" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532;K8S_POD_UID=5ad34d31-b27a-447b-8857-f0c34c2155b9" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/5ad34d31-b27a-447b-8857-f0c34c2155b9:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532 network default NAD default] [openstack/openstackclient fa14375c3e0b66b0a977b32eaf08cfc90110e3104979c18391d71dc5c1668532 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:a8 [10.217.0.168/23]
Mar 18 12:32:44 crc kubenswrapper[4921]: '
Mar 18 12:32:44 crc kubenswrapper[4921]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 18 12:32:44 crc kubenswrapper[4921]: > pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.093317 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.217961 4921 generic.go:334] "Generic (PLEG): container finished" podID="e09fc439-605a-4569-9481-7240199ee081" containerID="1eb6ebfba3ce6708e56a505a8edaacebf1f756a0a161dc1bc9295ed1d6b2be9a" exitCode=0
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.218080 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.218934 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e09fc439-605a-4569-9481-7240199ee081","Type":"ContainerDied","Data":"1eb6ebfba3ce6708e56a505a8edaacebf1f756a0a161dc1bc9295ed1d6b2be9a"}
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.223042 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5ad34d31-b27a-447b-8857-f0c34c2155b9" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.243208 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.399352 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzcmk\" (UniqueName: \"kubernetes.io/projected/5ad34d31-b27a-447b-8857-f0c34c2155b9-kube-api-access-lzcmk\") pod \"5ad34d31-b27a-447b-8857-f0c34c2155b9\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.399723 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-combined-ca-bundle\") pod \"5ad34d31-b27a-447b-8857-f0c34c2155b9\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.399882 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config\") pod \"5ad34d31-b27a-447b-8857-f0c34c2155b9\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.399907 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config-secret\") pod \"5ad34d31-b27a-447b-8857-f0c34c2155b9\" (UID: \"5ad34d31-b27a-447b-8857-f0c34c2155b9\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.402021 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5ad34d31-b27a-447b-8857-f0c34c2155b9" (UID: "5ad34d31-b27a-447b-8857-f0c34c2155b9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.407128 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad34d31-b27a-447b-8857-f0c34c2155b9-kube-api-access-lzcmk" (OuterVolumeSpecName: "kube-api-access-lzcmk") pod "5ad34d31-b27a-447b-8857-f0c34c2155b9" (UID: "5ad34d31-b27a-447b-8857-f0c34c2155b9"). InnerVolumeSpecName "kube-api-access-lzcmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.413051 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ad34d31-b27a-447b-8857-f0c34c2155b9" (UID: "5ad34d31-b27a-447b-8857-f0c34c2155b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.413168 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5ad34d31-b27a-447b-8857-f0c34c2155b9" (UID: "5ad34d31-b27a-447b-8857-f0c34c2155b9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.474326 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.502430 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzcmk\" (UniqueName: \"kubernetes.io/projected/5ad34d31-b27a-447b-8857-f0c34c2155b9-kube-api-access-lzcmk\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.502470 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.502482 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.502493 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5ad34d31-b27a-447b-8857-f0c34c2155b9-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.603750 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7nn2\" (UniqueName: \"kubernetes.io/projected/e09fc439-605a-4569-9481-7240199ee081-kube-api-access-d7nn2\") pod \"e09fc439-605a-4569-9481-7240199ee081\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.604164 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data\") pod \"e09fc439-605a-4569-9481-7240199ee081\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.604328 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-scripts\") pod \"e09fc439-605a-4569-9481-7240199ee081\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.605008 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data-custom\") pod \"e09fc439-605a-4569-9481-7240199ee081\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.605189 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e09fc439-605a-4569-9481-7240199ee081-etc-machine-id\") pod \"e09fc439-605a-4569-9481-7240199ee081\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.605431 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-combined-ca-bundle\") pod \"e09fc439-605a-4569-9481-7240199ee081\" (UID: \"e09fc439-605a-4569-9481-7240199ee081\") "
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.607157 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e09fc439-605a-4569-9481-7240199ee081-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e09fc439-605a-4569-9481-7240199ee081" (UID: "e09fc439-605a-4569-9481-7240199ee081"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.609916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e09fc439-605a-4569-9481-7240199ee081" (UID: "e09fc439-605a-4569-9481-7240199ee081"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.609978 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-scripts" (OuterVolumeSpecName: "scripts") pod "e09fc439-605a-4569-9481-7240199ee081" (UID: "e09fc439-605a-4569-9481-7240199ee081"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.613256 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09fc439-605a-4569-9481-7240199ee081-kube-api-access-d7nn2" (OuterVolumeSpecName: "kube-api-access-d7nn2") pod "e09fc439-605a-4569-9481-7240199ee081" (UID: "e09fc439-605a-4569-9481-7240199ee081"). InnerVolumeSpecName "kube-api-access-d7nn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.670908 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e09fc439-605a-4569-9481-7240199ee081" (UID: "e09fc439-605a-4569-9481-7240199ee081"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.692507 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.707264 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.707583 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7nn2\" (UniqueName: \"kubernetes.io/projected/e09fc439-605a-4569-9481-7240199ee081-kube-api-access-d7nn2\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.707598 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.707609 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.707621 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e09fc439-605a-4569-9481-7240199ee081-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.724181 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data" (OuterVolumeSpecName: "config-data") pod "e09fc439-605a-4569-9481-7240199ee081" (UID: "e09fc439-605a-4569-9481-7240199ee081"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:44 crc kubenswrapper[4921]: I0318 12:32:44.809662 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e09fc439-605a-4569-9481-7240199ee081-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.222201 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad34d31-b27a-447b-8857-f0c34c2155b9" path="/var/lib/kubelet/pods/5ad34d31-b27a-447b-8857-f0c34c2155b9/volumes" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.228614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e09fc439-605a-4569-9481-7240199ee081","Type":"ContainerDied","Data":"3ede83b66903c9605225025c3ee415b9d25208a5f453a03d3f71ccd11d672526"} Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.228678 4921 scope.go:117] "RemoveContainer" containerID="919e281fcb6315271dc3b2ca3372489b45d43b9d25aac2beb2c85495cb3a505c" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.228813 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.233144 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.233174 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bc025b71-4b10-41e1-bccf-0d67a9b36b0f","Type":"ContainerStarted","Data":"678be6a05c8f4c232a35d24aee667cf43dc0cb7e98fd68b1b0dfcdc10a2919fb"} Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.265793 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5ad34d31-b27a-447b-8857-f0c34c2155b9" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.280603 4921 scope.go:117] "RemoveContainer" containerID="1eb6ebfba3ce6708e56a505a8edaacebf1f756a0a161dc1bc9295ed1d6b2be9a" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.312603 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.338738 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.352164 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:45 crc kubenswrapper[4921]: E0318 12:32:45.352627 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="cinder-scheduler" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.352650 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="cinder-scheduler" Mar 18 12:32:45 crc kubenswrapper[4921]: E0318 12:32:45.352683 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="probe" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.352692 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="probe" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.352940 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="probe" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.352968 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09fc439-605a-4569-9481-7240199ee081" containerName="cinder-scheduler" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.354029 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.358775 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.359181 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.429418 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/bf1cfbea-21b1-4f20-95c2-22c07304789c-kube-api-access-zgnmt\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.429482 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.429558 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/bf1cfbea-21b1-4f20-95c2-22c07304789c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.429643 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.429710 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-scripts\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.429764 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532038 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-scripts\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532147 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532221 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/bf1cfbea-21b1-4f20-95c2-22c07304789c-kube-api-access-zgnmt\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532249 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf1cfbea-21b1-4f20-95c2-22c07304789c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532377 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.532602 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf1cfbea-21b1-4f20-95c2-22c07304789c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.537461 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.538504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.542255 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-scripts\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.563254 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.564404 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/bf1cfbea-21b1-4f20-95c2-22c07304789c-kube-api-access-zgnmt\") pod \"cinder-scheduler-0\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") " pod="openstack/cinder-scheduler-0" Mar 18 12:32:45 crc kubenswrapper[4921]: I0318 12:32:45.681914 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:32:46 crc kubenswrapper[4921]: I0318 12:32:46.321829 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:32:46 crc kubenswrapper[4921]: I0318 12:32:46.454580 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:46 crc kubenswrapper[4921]: I0318 12:32:46.455419 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.081479 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.081994 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.082043 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.082867 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ecb2d426426fe45a8d3167724569351edcb20678eccb43a43192be5b68165da4"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:32:47 
crc kubenswrapper[4921]: I0318 12:32:47.082925 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://ecb2d426426fe45a8d3167724569351edcb20678eccb43a43192be5b68165da4" gracePeriod=600 Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.243291 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09fc439-605a-4569-9481-7240199ee081" path="/var/lib/kubelet/pods/e09fc439-605a-4569-9481-7240199ee081/volumes" Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.263943 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bf1cfbea-21b1-4f20-95c2-22c07304789c","Type":"ContainerStarted","Data":"b6f6eac2f1ae64a2931ebab6341ccdde8403127e4d28bf10d8ffd9bcbe4a8ea3"} Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.264303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bf1cfbea-21b1-4f20-95c2-22c07304789c","Type":"ContainerStarted","Data":"20e2d01249d508dac8a4060bfbea3c12d079d255dbbda323862516c8c4613cb0"} Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.266445 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="ecb2d426426fe45a8d3167724569351edcb20678eccb43a43192be5b68165da4" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.266510 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"ecb2d426426fe45a8d3167724569351edcb20678eccb43a43192be5b68165da4"} Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.266555 4921 scope.go:117] "RemoveContainer" 
containerID="b0b15d604734e663af7f5ab441b134e0458c09c7238f9cd112cf51b089408bef" Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.533896 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vmlkw" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="registry-server" probeResult="failure" output=< Mar 18 12:32:47 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 12:32:47 crc kubenswrapper[4921]: > Mar 18 12:32:47 crc kubenswrapper[4921]: I0318 12:32:47.783474 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.223269 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f8c57d85f-gdqrq"] Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.232078 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.236634 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.236818 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.236900 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.241413 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f8c57d85f-gdqrq"] Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.292806 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-config-data\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: 
\"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.292872 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xpf6\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-kube-api-access-5xpf6\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.292968 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-run-httpd\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.293014 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-etc-swift\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.293064 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-public-tls-certs\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.293091 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-internal-tls-certs\") 
pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.293132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-log-httpd\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.293155 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-combined-ca-bundle\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.308705 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bf1cfbea-21b1-4f20-95c2-22c07304789c","Type":"ContainerStarted","Data":"ba3db590f738d1f28baa164d30603e96d6981b683ac84eb657fd01aa1145a053"} Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.310839 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd"} Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.341799 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.341780451 podStartE2EDuration="3.341780451s" podCreationTimestamp="2026-03-18 12:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 12:32:48.335036639 +0000 UTC m=+1387.884957288" watchObservedRunningTime="2026-03-18 12:32:48.341780451 +0000 UTC m=+1387.891701090" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.395003 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-config-data\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.395074 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xpf6\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-kube-api-access-5xpf6\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.395176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-run-httpd\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.395225 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-etc-swift\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.395313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-public-tls-certs\") pod 
\"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.395537 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-internal-tls-certs\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.398012 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-log-httpd\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.398048 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-combined-ca-bundle\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.397196 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-run-httpd\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.398976 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-log-httpd\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " 
pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.403749 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-internal-tls-certs\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.406758 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-config-data\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.407075 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-etc-swift\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.409097 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-public-tls-certs\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.412265 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-combined-ca-bundle\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 
12:32:48.431440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xpf6\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-kube-api-access-5xpf6\") pod \"swift-proxy-7f8c57d85f-gdqrq\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") " pod="openstack/swift-proxy-7f8c57d85f-gdqrq"
Mar 18 12:32:48 crc kubenswrapper[4921]: I0318 12:32:48.557069 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f8c57d85f-gdqrq"
Mar 18 12:32:49 crc kubenswrapper[4921]: I0318 12:32:49.239512 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f8c57d85f-gdqrq"]
Mar 18 12:32:49 crc kubenswrapper[4921]: I0318 12:32:49.348392 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" event={"ID":"870b5852-0790-4d4a-a0c1-df7789287b36","Type":"ContainerStarted","Data":"99e18b3ef98f53aa6c585067bbad4419a8c051b09b8e3f0ba8c5bd0ad16cdcf5"}
Mar 18 12:32:50 crc kubenswrapper[4921]: I0318 12:32:50.358232 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" event={"ID":"870b5852-0790-4d4a-a0c1-df7789287b36","Type":"ContainerStarted","Data":"8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5"}
Mar 18 12:32:50 crc kubenswrapper[4921]: I0318 12:32:50.358803 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" event={"ID":"870b5852-0790-4d4a-a0c1-df7789287b36","Type":"ContainerStarted","Data":"f7216c7a86cd828cfa2701494b171fae9e8224c04fb41a1372f7e7cb90e5cf3a"}
Mar 18 12:32:50 crc kubenswrapper[4921]: I0318 12:32:50.358825 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f8c57d85f-gdqrq"
Mar 18 12:32:50 crc kubenswrapper[4921]: I0318 12:32:50.358835 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f8c57d85f-gdqrq"
Mar 18 12:32:50 crc kubenswrapper[4921]: I0318 12:32:50.381387 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" podStartSLOduration=2.381365624 podStartE2EDuration="2.381365624s" podCreationTimestamp="2026-03-18 12:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:50.380477019 +0000 UTC m=+1389.930397668" watchObservedRunningTime="2026-03-18 12:32:50.381365624 +0000 UTC m=+1389.931286263"
Mar 18 12:32:50 crc kubenswrapper[4921]: I0318 12:32:50.682010 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 18 12:32:50 crc kubenswrapper[4921]: E0318 12:32:50.819202 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7ae8e5b_8419_4e0d_a1c7_b8fe2bc0eb78.slice/crio-92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.328387 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.384970 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-log-httpd\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.385150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwcqq\" (UniqueName: \"kubernetes.io/projected/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-kube-api-access-bwcqq\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.385226 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-combined-ca-bundle\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.385243 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-config-data\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.385311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-run-httpd\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.385364 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-scripts\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.385423 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-sg-core-conf-yaml\") pod \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\" (UID: \"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78\") "
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.389827 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.391597 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.412312 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-scripts" (OuterVolumeSpecName: "scripts") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.426453 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-kube-api-access-bwcqq" (OuterVolumeSpecName: "kube-api-access-bwcqq") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "kube-api-access-bwcqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.447323 4921 generic.go:334] "Generic (PLEG): container finished" podID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerID="92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9" exitCode=137
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.447736 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.447945 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerDied","Data":"92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9"}
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.448090 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78","Type":"ContainerDied","Data":"77e3c960b24d096d19356c1028545fc660992459cebfcd16320749d615d8a19f"}
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.448163 4921 scope.go:117] "RemoveContainer" containerID="92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.464426 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.488008 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwcqq\" (UniqueName: \"kubernetes.io/projected/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-kube-api-access-bwcqq\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.488136 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.489328 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.489346 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.489356 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.496928 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.522357 4921 scope.go:117] "RemoveContainer" containerID="e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.527899 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-config-data" (OuterVolumeSpecName: "config-data") pod "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" (UID: "d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.566718 4921 scope.go:117] "RemoveContainer" containerID="7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.591711 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.591751 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.606025 4921 scope.go:117] "RemoveContainer" containerID="5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.633362 4921 scope.go:117] "RemoveContainer" containerID="92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.633869 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9\": container with ID starting with 92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9 not found: ID does not exist" containerID="92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.633930 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9"} err="failed to get container status \"92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9\": rpc error: code = NotFound desc = could not find container \"92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9\": container with ID starting with 92de8668c853d2c45433ceda335ea0b89b31176c1839f43f8ed0d05d59b146a9 not found: ID does not exist"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.633962 4921 scope.go:117] "RemoveContainer" containerID="e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.634496 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c\": container with ID starting with e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c not found: ID does not exist" containerID="e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.634537 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c"} err="failed to get container status \"e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c\": rpc error: code = NotFound desc = could not find container \"e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c\": container with ID starting with e6fed92df1821296afd33ecb608b7ba9ae12165bbf3a27cfa3b1f764b423153c not found: ID does not exist"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.634565 4921 scope.go:117] "RemoveContainer" containerID="7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.634917 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8\": container with ID starting with 7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8 not found: ID does not exist" containerID="7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.634962 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8"} err="failed to get container status \"7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8\": rpc error: code = NotFound desc = could not find container \"7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8\": container with ID starting with 7f62f2c2a2dca579bbfcdfafeac5d709ea1f1ec6c71e659692b531a3c38960a8 not found: ID does not exist"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.634990 4921 scope.go:117] "RemoveContainer" containerID="5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.635507 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0\": container with ID starting with 5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0 not found: ID does not exist" containerID="5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.635541 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0"} err="failed to get container status \"5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0\": rpc error: code = NotFound desc = could not find container \"5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0\": container with ID starting with 5385db806377ec32eb9e7edb9929440c1a3d8b2bd3f9767aa55a459f71a7dda0 not found: ID does not exist"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.783598 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.792789 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823285 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.823631 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="sg-core"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823648 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="sg-core"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.823667 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-central-agent"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823673 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-central-agent"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.823683 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="proxy-httpd"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823689 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="proxy-httpd"
Mar 18 12:32:51 crc kubenswrapper[4921]: E0318 12:32:51.823704 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-notification-agent"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823710 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-notification-agent"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823880 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="proxy-httpd"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823910 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-central-agent"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823924 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="ceilometer-notification-agent"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.823939 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" containerName="sg-core"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.825513 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.829738 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.832054 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.843003 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.900806 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-config-data\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.900964 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.901002 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-log-httpd\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.901036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.901131 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-scripts\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.901171 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vdl\" (UniqueName: \"kubernetes.io/projected/5249cc52-d843-4717-9182-85a5d617b94c-kube-api-access-m6vdl\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:51 crc kubenswrapper[4921]: I0318 12:32:51.901219 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-run-httpd\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.002954 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-scripts\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vdl\" (UniqueName: \"kubernetes.io/projected/5249cc52-d843-4717-9182-85a5d617b94c-kube-api-access-m6vdl\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003039 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-run-httpd\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003131 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-config-data\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003208 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003235 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-log-httpd\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003261 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003962 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-log-httpd\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.003958 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-run-httpd\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.007535 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-scripts\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.009936 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.013755 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.021675 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vdl\" (UniqueName: \"kubernetes.io/projected/5249cc52-d843-4717-9182-85a5d617b94c-kube-api-access-m6vdl\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.027590 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-config-data\") pod \"ceilometer-0\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.208385 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 12:32:52 crc kubenswrapper[4921]: W0318 12:32:52.709663 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5249cc52_d843_4717_9182_85a5d617b94c.slice/crio-cee963c66792f67bd4d8cccf723e350d7286d54257d715331ec92aa359db8fa4 WatchSource:0}: Error finding container cee963c66792f67bd4d8cccf723e350d7286d54257d715331ec92aa359db8fa4: Status 404 returned error can't find the container with id cee963c66792f67bd4d8cccf723e350d7286d54257d715331ec92aa359db8fa4
Mar 18 12:32:52 crc kubenswrapper[4921]: I0318 12:32:52.713495 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:53 crc kubenswrapper[4921]: I0318 12:32:53.222488 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78" path="/var/lib/kubelet/pods/d7ae8e5b-8419-4e0d-a1c7-b8fe2bc0eb78/volumes"
Mar 18 12:32:53 crc kubenswrapper[4921]: I0318 12:32:53.436211 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:32:53 crc kubenswrapper[4921]: I0318 12:32:53.478161 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerStarted","Data":"cee963c66792f67bd4d8cccf723e350d7286d54257d715331ec92aa359db8fa4"}
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.503641 4921 generic.go:334] "Generic (PLEG): container finished" podID="6d65e295-0914-4c05-bd33-fa99e512893e" containerID="6eb2bfc7a95745220c84b47f5d1b7fa05f6b59988bf8d3cfb91f23e2525cf1f5" exitCode=0
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.503762 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58479fdccb-s9k88" event={"ID":"6d65e295-0914-4c05-bd33-fa99e512893e","Type":"ContainerDied","Data":"6eb2bfc7a95745220c84b47f5d1b7fa05f6b59988bf8d3cfb91f23e2525cf1f5"}
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.541804 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qhdnd"]
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.543623 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.551564 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-operator-scripts\") pod \"nova-api-db-create-qhdnd\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.551638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qnc\" (UniqueName: \"kubernetes.io/projected/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-kube-api-access-c6qnc\") pod \"nova-api-db-create-qhdnd\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.558420 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qhdnd"]
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.629374 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9n4lr"]
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.631220 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9n4lr"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.642930 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6c41-account-create-update-qsnfn"]
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.645130 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c41-account-create-update-qsnfn"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.653758 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6d8c16c-799a-4946-80d8-5db5566e76c9-operator-scripts\") pod \"nova-cell0-db-create-9n4lr\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " pod="openstack/nova-cell0-db-create-9n4lr"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.653825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qnc\" (UniqueName: \"kubernetes.io/projected/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-kube-api-access-c6qnc\") pod \"nova-api-db-create-qhdnd\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.653990 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmdr\" (UniqueName: \"kubernetes.io/projected/b6d8c16c-799a-4946-80d8-5db5566e76c9-kube-api-access-9gmdr\") pod \"nova-cell0-db-create-9n4lr\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " pod="openstack/nova-cell0-db-create-9n4lr"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.654045 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-operator-scripts\") pod \"nova-api-6c41-account-create-update-qsnfn\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " pod="openstack/nova-api-6c41-account-create-update-qsnfn"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.654081 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqms\" (UniqueName: \"kubernetes.io/projected/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-kube-api-access-5cqms\") pod \"nova-api-6c41-account-create-update-qsnfn\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " pod="openstack/nova-api-6c41-account-create-update-qsnfn"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.654179 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-operator-scripts\") pod \"nova-api-db-create-qhdnd\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.655448 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-operator-scripts\") pod \"nova-api-db-create-qhdnd\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.657177 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.674985 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9n4lr"]
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.683297 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c41-account-create-update-qsnfn"]
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.690399 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qnc\" (UniqueName: \"kubernetes.io/projected/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-kube-api-access-c6qnc\") pod \"nova-api-db-create-qhdnd\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " pod="openstack/nova-api-db-create-qhdnd"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.758400 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmdr\" (UniqueName: \"kubernetes.io/projected/b6d8c16c-799a-4946-80d8-5db5566e76c9-kube-api-access-9gmdr\") pod \"nova-cell0-db-create-9n4lr\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " pod="openstack/nova-cell0-db-create-9n4lr"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.758477 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-operator-scripts\") pod \"nova-api-6c41-account-create-update-qsnfn\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " pod="openstack/nova-api-6c41-account-create-update-qsnfn"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.758510 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqms\" (UniqueName: \"kubernetes.io/projected/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-kube-api-access-5cqms\") pod \"nova-api-6c41-account-create-update-qsnfn\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " pod="openstack/nova-api-6c41-account-create-update-qsnfn"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.758619 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6d8c16c-799a-4946-80d8-5db5566e76c9-operator-scripts\") pod \"nova-cell0-db-create-9n4lr\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " pod="openstack/nova-cell0-db-create-9n4lr"
Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.759570 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6d8c16c-799a-4946-80d8-5db5566e76c9-operator-scripts\") pod \"nova-cell0-db-create-9n4lr\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " pod="openstack/nova-cell0-db-create-9n4lr" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.762485 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-operator-scripts\") pod \"nova-api-6c41-account-create-update-qsnfn\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " pod="openstack/nova-api-6c41-account-create-update-qsnfn" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.780772 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqms\" (UniqueName: \"kubernetes.io/projected/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-kube-api-access-5cqms\") pod \"nova-api-6c41-account-create-update-qsnfn\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " pod="openstack/nova-api-6c41-account-create-update-qsnfn" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.782673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmdr\" (UniqueName: \"kubernetes.io/projected/b6d8c16c-799a-4946-80d8-5db5566e76c9-kube-api-access-9gmdr\") pod \"nova-cell0-db-create-9n4lr\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " pod="openstack/nova-cell0-db-create-9n4lr" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.849314 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jt4qq"] Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.851808 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.854337 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-aea5-account-create-update-glsg4"] Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.855142 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.859915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68l8\" (UniqueName: \"kubernetes.io/projected/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-kube-api-access-b68l8\") pod \"nova-cell1-db-create-jt4qq\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.859969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-operator-scripts\") pod \"nova-cell1-db-create-jt4qq\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.860812 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jt4qq"] Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.878059 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.884739 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qhdnd" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.889729 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aea5-account-create-update-glsg4"] Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.972453 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9n4lr" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.985808 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68l8\" (UniqueName: \"kubernetes.io/projected/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-kube-api-access-b68l8\") pod \"nova-cell1-db-create-jt4qq\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.985973 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-operator-scripts\") pod \"nova-cell1-db-create-jt4qq\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.987089 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-operator-scripts\") pod \"nova-cell1-db-create-jt4qq\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:54 crc kubenswrapper[4921]: I0318 12:32:54.987809 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6c41-account-create-update-qsnfn" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.006164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68l8\" (UniqueName: \"kubernetes.io/projected/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-kube-api-access-b68l8\") pod \"nova-cell1-db-create-jt4qq\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.046960 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3a84-account-create-update-9plfj"] Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.048754 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.052630 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.065198 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3a84-account-create-update-9plfj"] Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.089437 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85f3503b-9ffc-4667-b9f5-9a4880895948-operator-scripts\") pod \"nova-cell0-aea5-account-create-update-glsg4\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.089725 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkkj\" (UniqueName: \"kubernetes.io/projected/85f3503b-9ffc-4667-b9f5-9a4880895948-kube-api-access-pmkkj\") pod \"nova-cell0-aea5-account-create-update-glsg4\" (UID: 
\"85f3503b-9ffc-4667-b9f5-9a4880895948\") " pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.190968 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85f3503b-9ffc-4667-b9f5-9a4880895948-operator-scripts\") pod \"nova-cell0-aea5-account-create-update-glsg4\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.191091 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkkj\" (UniqueName: \"kubernetes.io/projected/85f3503b-9ffc-4667-b9f5-9a4880895948-kube-api-access-pmkkj\") pod \"nova-cell0-aea5-account-create-update-glsg4\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.191146 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-operator-scripts\") pod \"nova-cell1-3a84-account-create-update-9plfj\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.191198 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqhxm\" (UniqueName: \"kubernetes.io/projected/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-kube-api-access-vqhxm\") pod \"nova-cell1-3a84-account-create-update-9plfj\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.191769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85f3503b-9ffc-4667-b9f5-9a4880895948-operator-scripts\") pod \"nova-cell0-aea5-account-create-update-glsg4\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.214155 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkkj\" (UniqueName: \"kubernetes.io/projected/85f3503b-9ffc-4667-b9f5-9a4880895948-kube-api-access-pmkkj\") pod \"nova-cell0-aea5-account-create-update-glsg4\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.216460 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.222870 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.293203 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-operator-scripts\") pod \"nova-cell1-3a84-account-create-update-9plfj\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.293336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqhxm\" (UniqueName: \"kubernetes.io/projected/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-kube-api-access-vqhxm\") pod \"nova-cell1-3a84-account-create-update-9plfj\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.294133 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-operator-scripts\") pod \"nova-cell1-3a84-account-create-update-9plfj\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.309787 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqhxm\" (UniqueName: \"kubernetes.io/projected/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-kube-api-access-vqhxm\") pod \"nova-cell1-3a84-account-create-update-9plfj\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:55 crc kubenswrapper[4921]: I0318 12:32:55.411831 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.057928 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.516800 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.575398 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.727533 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.727783 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-log" containerID="cri-o://34a5c49d33b50f9daecefad0b2a32f7081ca17c826b1bebb75452011680a9f66" gracePeriod=30 Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.727916 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-httpd" containerID="cri-o://b8750eede6490ad6a6c09658966fb05cac1bc82d03658becf1a1c604c33190d2" gracePeriod=30 Mar 18 12:32:56 crc kubenswrapper[4921]: I0318 12:32:56.766194 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmlkw"] Mar 18 12:32:57 crc kubenswrapper[4921]: I0318 12:32:57.553348 4921 generic.go:334] "Generic (PLEG): container finished" podID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerID="34a5c49d33b50f9daecefad0b2a32f7081ca17c826b1bebb75452011680a9f66" exitCode=143 Mar 18 12:32:57 crc kubenswrapper[4921]: 
I0318 12:32:57.553674 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"879edc6c-5a15-4316-9f8f-58bcf8d87b95","Type":"ContainerDied","Data":"34a5c49d33b50f9daecefad0b2a32f7081ca17c826b1bebb75452011680a9f66"} Mar 18 12:32:57 crc kubenswrapper[4921]: I0318 12:32:57.613000 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:32:57 crc kubenswrapper[4921]: I0318 12:32:57.613414 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-httpd" containerID="cri-o://9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509" gracePeriod=30 Mar 18 12:32:57 crc kubenswrapper[4921]: I0318 12:32:57.613695 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-log" containerID="cri-o://6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9" gracePeriod=30 Mar 18 12:32:58 crc kubenswrapper[4921]: I0318 12:32:58.563757 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:58 crc kubenswrapper[4921]: I0318 12:32:58.565062 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:32:58 crc kubenswrapper[4921]: I0318 12:32:58.585518 4921 generic.go:334] "Generic (PLEG): container finished" podID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerID="6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9" exitCode=143 Mar 18 12:32:58 crc kubenswrapper[4921]: I0318 12:32:58.585729 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vmlkw" 
podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="registry-server" containerID="cri-o://64a7e5c6a784f2cbf9471ce705463e11a7b554250f3ac9765668e73809c9565d" gracePeriod=2 Mar 18 12:32:58 crc kubenswrapper[4921]: I0318 12:32:58.585783 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f4a58e9-3870-4b79-bbb6-6ec610898b96","Type":"ContainerDied","Data":"6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9"} Mar 18 12:32:59 crc kubenswrapper[4921]: I0318 12:32:59.577954 4921 scope.go:117] "RemoveContainer" containerID="ed442339ea68f65084a0d26f50effbf3acc23f10f1455f1be7d0dfb1631dfec6" Mar 18 12:32:59 crc kubenswrapper[4921]: I0318 12:32:59.607533 4921 generic.go:334] "Generic (PLEG): container finished" podID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerID="64a7e5c6a784f2cbf9471ce705463e11a7b554250f3ac9765668e73809c9565d" exitCode=0 Mar 18 12:32:59 crc kubenswrapper[4921]: I0318 12:32:59.607603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerDied","Data":"64a7e5c6a784f2cbf9471ce705463e11a7b554250f3ac9765668e73809c9565d"} Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.501015 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.633884 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-catalog-content\") pod \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.634264 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-utilities\") pod \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.634358 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdxds\" (UniqueName: \"kubernetes.io/projected/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-kube-api-access-kdxds\") pod \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\" (UID: \"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.635155 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-utilities" (OuterVolumeSpecName: "utilities") pod "02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" (UID: "02d6ceca-c377-42d4-b7f6-d5a9a083bbe7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.643566 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-kube-api-access-kdxds" (OuterVolumeSpecName: "kube-api-access-kdxds") pod "02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" (UID: "02d6ceca-c377-42d4-b7f6-d5a9a083bbe7"). InnerVolumeSpecName "kube-api-access-kdxds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.656228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmlkw" event={"ID":"02d6ceca-c377-42d4-b7f6-d5a9a083bbe7","Type":"ContainerDied","Data":"643b1c5daa000753a2e0b4aaadfc77cd1e3cb3513cfdba679f1124e0a6149d4e"} Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.656311 4921 scope.go:117] "RemoveContainer" containerID="64a7e5c6a784f2cbf9471ce705463e11a7b554250f3ac9765668e73809c9565d" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.656560 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmlkw" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.674473 4921 generic.go:334] "Generic (PLEG): container finished" podID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerID="b8750eede6490ad6a6c09658966fb05cac1bc82d03658becf1a1c604c33190d2" exitCode=0 Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.674593 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"879edc6c-5a15-4316-9f8f-58bcf8d87b95","Type":"ContainerDied","Data":"b8750eede6490ad6a6c09658966fb05cac1bc82d03658becf1a1c604c33190d2"} Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.711828 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" (UID: "02d6ceca-c377-42d4-b7f6-d5a9a083bbe7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.737577 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdxds\" (UniqueName: \"kubernetes.io/projected/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-kube-api-access-kdxds\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.737621 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.737635 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.739646 4921 scope.go:117] "RemoveContainer" containerID="4d5b46da32425f4b00cb922aa64ca6a8d7619f7670eb59d5b236eeef1b9cb4b6" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.791594 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.838324 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sszsn\" (UniqueName: \"kubernetes.io/projected/6d65e295-0914-4c05-bd33-fa99e512893e-kube-api-access-sszsn\") pod \"6d65e295-0914-4c05-bd33-fa99e512893e\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.838375 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-config\") pod \"6d65e295-0914-4c05-bd33-fa99e512893e\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.838412 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-ovndb-tls-certs\") pod \"6d65e295-0914-4c05-bd33-fa99e512893e\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.838494 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-combined-ca-bundle\") pod \"6d65e295-0914-4c05-bd33-fa99e512893e\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.838529 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-httpd-config\") pod \"6d65e295-0914-4c05-bd33-fa99e512893e\" (UID: \"6d65e295-0914-4c05-bd33-fa99e512893e\") " Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.847531 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6d65e295-0914-4c05-bd33-fa99e512893e-kube-api-access-sszsn" (OuterVolumeSpecName: "kube-api-access-sszsn") pod "6d65e295-0914-4c05-bd33-fa99e512893e" (UID: "6d65e295-0914-4c05-bd33-fa99e512893e"). InnerVolumeSpecName "kube-api-access-sszsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.856900 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6d65e295-0914-4c05-bd33-fa99e512893e" (UID: "6d65e295-0914-4c05-bd33-fa99e512893e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.940793 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.941165 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sszsn\" (UniqueName: \"kubernetes.io/projected/6d65e295-0914-4c05-bd33-fa99e512893e-kube-api-access-sszsn\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:00 crc kubenswrapper[4921]: I0318 12:33:00.968395 4921 scope.go:117] "RemoveContainer" containerID="e9e7dac82d03b88207927832985decb08956e842ef2305729e31d3999509a41b" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.044717 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d65e295-0914-4c05-bd33-fa99e512893e" (UID: "6d65e295-0914-4c05-bd33-fa99e512893e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.055301 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-config" (OuterVolumeSpecName: "config") pod "6d65e295-0914-4c05-bd33-fa99e512893e" (UID: "6d65e295-0914-4c05-bd33-fa99e512893e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.091190 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6d65e295-0914-4c05-bd33-fa99e512893e" (UID: "6d65e295-0914-4c05-bd33-fa99e512893e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.144676 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.144715 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.144728 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d65e295-0914-4c05-bd33-fa99e512893e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.270674 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.278372 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmlkw"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.315334 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vmlkw"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455044 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455129 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-config-data\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455175 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qpwd\" (UniqueName: \"kubernetes.io/projected/879edc6c-5a15-4316-9f8f-58bcf8d87b95-kube-api-access-4qpwd\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455224 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-httpd-run\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455264 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-combined-ca-bundle\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455305 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-internal-tls-certs\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455417 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-scripts\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.455474 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-logs\") pod \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\" (UID: \"879edc6c-5a15-4316-9f8f-58bcf8d87b95\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.456371 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-logs" (OuterVolumeSpecName: "logs") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.459423 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.463541 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.468775 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879edc6c-5a15-4316-9f8f-58bcf8d87b95-kube-api-access-4qpwd" (OuterVolumeSpecName: "kube-api-access-4qpwd") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "kube-api-access-4qpwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.476318 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-scripts" (OuterVolumeSpecName: "scripts") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.495777 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.543322 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.553570 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.556989 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.557020 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qpwd\" (UniqueName: \"kubernetes.io/projected/879edc6c-5a15-4316-9f8f-58bcf8d87b95-kube-api-access-4qpwd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.557031 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.557039 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.557049 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.557058 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.557070 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/879edc6c-5a15-4316-9f8f-58bcf8d87b95-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.558743 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-config-data" (OuterVolumeSpecName: "config-data") pod "879edc6c-5a15-4316-9f8f-58bcf8d87b95" (UID: "879edc6c-5a15-4316-9f8f-58bcf8d87b95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.601387 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.657886 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-scripts\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659331 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvd2b\" (UniqueName: \"kubernetes.io/projected/9f4a58e9-3870-4b79-bbb6-6ec610898b96-kube-api-access-mvd2b\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659371 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: 
\"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659460 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-config-data\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659489 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-combined-ca-bundle\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659520 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-public-tls-certs\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659619 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-httpd-run\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.659650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-logs\") pod \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\" (UID: \"9f4a58e9-3870-4b79-bbb6-6ec610898b96\") " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.660285 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node 
\"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.660309 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/879edc6c-5a15-4316-9f8f-58bcf8d87b95-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.661518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-logs" (OuterVolumeSpecName: "logs") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.666508 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.670577 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jt4qq"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.674245 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-scripts" (OuterVolumeSpecName: "scripts") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.674522 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4a58e9-3870-4b79-bbb6-6ec610898b96-kube-api-access-mvd2b" (OuterVolumeSpecName: "kube-api-access-mvd2b") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "kube-api-access-mvd2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: W0318 12:33:01.684923 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d8c16c_799a_4946_80d8_5db5566e76c9.slice/crio-5cf3103802bc0210cbb11eef24324920edb9c408a9b92b79ca1abb0f6047354d WatchSource:0}: Error finding container 5cf3103802bc0210cbb11eef24324920edb9c408a9b92b79ca1abb0f6047354d: Status 404 returned error can't find the container with id 5cf3103802bc0210cbb11eef24324920edb9c408a9b92b79ca1abb0f6047354d Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.688741 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9n4lr"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.705610 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aea5-account-create-update-glsg4"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.710917 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58479fdccb-s9k88" event={"ID":"6d65e295-0914-4c05-bd33-fa99e512893e","Type":"ContainerDied","Data":"052423c1801110e93c59e4be562d37675607074a3c291981273b371af276de2d"} Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.710984 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58479fdccb-s9k88" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.711158 4921 scope.go:117] "RemoveContainer" containerID="05e8733e82208abf68bb03b99d7513198eaf58296f32a65d158dc2165f661c48" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.717867 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.718387 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3a84-account-create-update-9plfj"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.721719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bc025b71-4b10-41e1-bccf-0d67a9b36b0f","Type":"ContainerStarted","Data":"5a943b023196d78dd7c6404da7d91670ba6f169429e0a23456b547a01a996a57"} Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.731334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerStarted","Data":"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba"} Mar 18 12:33:01 crc kubenswrapper[4921]: W0318 12:33:01.732062 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85f3503b_9ffc_4667_b9f5_9a4880895948.slice/crio-e69b069aa8b8c7cf8b4c7698b2eefe1005a8cc12a7b1da2cec47e5b71c9cc031 WatchSource:0}: Error finding container e69b069aa8b8c7cf8b4c7698b2eefe1005a8cc12a7b1da2cec47e5b71c9cc031: Status 404 returned error can't find the container with id e69b069aa8b8c7cf8b4c7698b2eefe1005a8cc12a7b1da2cec47e5b71c9cc031 Mar 18 12:33:01 crc 
kubenswrapper[4921]: I0318 12:33:01.733431 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jt4qq" event={"ID":"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9","Type":"ContainerStarted","Data":"36187e7b1a8c327f019902a46a075f12f6955a7f57f629de23649e452d6ddbc0"} Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.761729 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.761763 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f4a58e9-3870-4b79-bbb6-6ec610898b96-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.761774 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.761785 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvd2b\" (UniqueName: \"kubernetes.io/projected/9f4a58e9-3870-4b79-bbb6-6ec610898b96-kube-api-access-mvd2b\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.761811 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.765087 4921 scope.go:117] "RemoveContainer" containerID="6eb2bfc7a95745220c84b47f5d1b7fa05f6b59988bf8d3cfb91f23e2525cf1f5" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.767726 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qhdnd"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.778616 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"879edc6c-5a15-4316-9f8f-58bcf8d87b95","Type":"ContainerDied","Data":"d33a6316a4c3fe7068742d37db1f80050b6301b34b172dd6d19c9e6e6cd08b88"} Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.778716 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.789262 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c41-account-create-update-qsnfn"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.794584 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.798482 4921 generic.go:334] "Generic (PLEG): container finished" podID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerID="9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509" exitCode=0 Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.798541 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f4a58e9-3870-4b79-bbb6-6ec610898b96","Type":"ContainerDied","Data":"9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509"} Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.798576 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f4a58e9-3870-4b79-bbb6-6ec610898b96","Type":"ContainerDied","Data":"1c88dd00f26cc529669a6320c09b03e3bbd8747345b20fcfd4d183ea8ba4060b"} Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.798883 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.806907 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58479fdccb-s9k88"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.820477 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-58479fdccb-s9k88"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.836821 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.254462213 podStartE2EDuration="18.836364098s" podCreationTimestamp="2026-03-18 12:32:43 +0000 UTC" firstStartedPulling="2026-03-18 12:32:44.703399807 +0000 UTC m=+1384.253320446" lastFinishedPulling="2026-03-18 12:33:00.285301692 +0000 UTC m=+1399.835222331" observedRunningTime="2026-03-18 12:33:01.767352759 +0000 UTC m=+1401.317273398" watchObservedRunningTime="2026-03-18 12:33:01.836364098 +0000 UTC m=+1401.386284757" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.868456 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.869766 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.885378 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.895823 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896292 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-httpd" Mar 18 12:33:01 crc 
kubenswrapper[4921]: I0318 12:33:01.896309 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896335 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-api" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896343 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-api" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896355 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896363 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896380 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-log" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896387 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-log" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896396 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="extract-utilities" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896404 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="extract-utilities" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896412 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 
12:33:01.896419 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896426 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-log" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896433 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-log" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896450 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="registry-server" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896457 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="registry-server" Mar 18 12:33:01 crc kubenswrapper[4921]: E0318 12:33:01.896478 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="extract-content" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.896485 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="extract-content" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.897971 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-api" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.897995 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" containerName="neutron-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.898008 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.898021 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" containerName="registry-server" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.898064 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-log" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.898090 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" containerName="glance-log" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.898132 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-httpd" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.900014 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.903018 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.903294 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.906347 4921 scope.go:117] "RemoveContainer" containerID="b8750eede6490ad6a6c09658966fb05cac1bc82d03658becf1a1c604c33190d2" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.924073 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.950640 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-config-data" (OuterVolumeSpecName: "config-data") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.953753 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.966056 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f4a58e9-3870-4b79-bbb6-6ec610898b96" (UID: "9f4a58e9-3870-4b79-bbb6-6ec610898b96"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.973078 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.973115 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.973128 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f4a58e9-3870-4b79-bbb6-6ec610898b96-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:01 crc kubenswrapper[4921]: I0318 12:33:01.989553 4921 scope.go:117] "RemoveContainer" containerID="34a5c49d33b50f9daecefad0b2a32f7081ca17c826b1bebb75452011680a9f66" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074421 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsgv\" (UniqueName: \"kubernetes.io/projected/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-kube-api-access-2rsgv\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074755 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074789 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074875 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074931 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.074969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176063 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176128 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176242 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176264 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176285 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176395 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.176424 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rsgv\" (UniqueName: \"kubernetes.io/projected/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-kube-api-access-2rsgv\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.177265 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.179812 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.180248 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.189846 4921 scope.go:117] "RemoveContainer" containerID="9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.199121 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.217785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.231105 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.232031 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsgv\" (UniqueName: \"kubernetes.io/projected/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-kube-api-access-2rsgv\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.232232 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.235500 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.249415 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.281308 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.285687 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.289515 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.293853 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.297462 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.324011 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.372072 4921 scope.go:117] "RemoveContainer" containerID="6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387300 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-config-data\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387385 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-logs\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387468 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387543 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9dt\" (UniqueName: \"kubernetes.io/projected/778f8baf-82ce-457d-b32d-35d3abe1a79d-kube-api-access-5j9dt\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387725 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387798 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-scripts\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.387896 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc 
kubenswrapper[4921]: I0318 12:33:02.387984 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.491138 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-logs\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.491642 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.491790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9dt\" (UniqueName: \"kubernetes.io/projected/778f8baf-82ce-457d-b32d-35d3abe1a79d-kube-api-access-5j9dt\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.491901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.492006 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-scripts\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.492190 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.492323 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.492516 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-config-data\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.493419 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.492320 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-logs\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.493985 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.498640 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-config-data\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.499806 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.500456 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.502359 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.512194 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9dt\" (UniqueName: \"kubernetes.io/projected/778f8baf-82ce-457d-b32d-35d3abe1a79d-kube-api-access-5j9dt\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.521245 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.526450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.652182 4921 scope.go:117] "RemoveContainer" containerID="9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509" Mar 18 12:33:02 crc kubenswrapper[4921]: E0318 12:33:02.660706 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509\": container with ID starting with 9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509 not found: ID does not exist" containerID="9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.660751 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509"} err="failed to get container status 
\"9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509\": rpc error: code = NotFound desc = could not find container \"9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509\": container with ID starting with 9054ecd9b2b7588f51d37399128ac037013235f7ad0dc18d7ce6ea47dc95d509 not found: ID does not exist" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.660817 4921 scope.go:117] "RemoveContainer" containerID="6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9" Mar 18 12:33:02 crc kubenswrapper[4921]: E0318 12:33:02.661210 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9\": container with ID starting with 6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9 not found: ID does not exist" containerID="6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.661229 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9"} err="failed to get container status \"6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9\": rpc error: code = NotFound desc = could not find container \"6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9\": container with ID starting with 6bbc0b8b5a367ba1bfa2ea885da7ae40406b33dc2a11c34a694a90b200ec8af9 not found: ID does not exist" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.709902 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.841035 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" event={"ID":"85f3503b-9ffc-4667-b9f5-9a4880895948","Type":"ContainerStarted","Data":"8ae61aa8c057c79dfa1422f040343df978c73fafb418d65005dcbeffb56a0b62"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.841092 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" event={"ID":"85f3503b-9ffc-4667-b9f5-9a4880895948","Type":"ContainerStarted","Data":"e69b069aa8b8c7cf8b4c7698b2eefe1005a8cc12a7b1da2cec47e5b71c9cc031"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.853919 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jt4qq" event={"ID":"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9","Type":"ContainerStarted","Data":"f4a7f718a064ff5327ef0699ff8ee174589413ea084ad26555e7c4fd891adbf4"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.872362 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerStarted","Data":"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.890908 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" podStartSLOduration=8.890884119999999 podStartE2EDuration="8.89088412s" podCreationTimestamp="2026-03-18 12:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:02.871468739 +0000 UTC m=+1402.421389448" watchObservedRunningTime="2026-03-18 12:33:02.89088412 +0000 UTC m=+1402.440804759" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.905764 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qhdnd" event={"ID":"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8","Type":"ContainerStarted","Data":"f9b427283c3c62291529dbbfa222883688637448a5c073c793a49f2cb67e24a5"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.906239 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qhdnd" event={"ID":"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8","Type":"ContainerStarted","Data":"9a01baf9c11816ceec4cf994180e04b040fa71907abaabd6e99b2d25f487d90c"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.907335 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-jt4qq" podStartSLOduration=8.907193783 podStartE2EDuration="8.907193783s" podCreationTimestamp="2026-03-18 12:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:02.892924948 +0000 UTC m=+1402.442845597" watchObservedRunningTime="2026-03-18 12:33:02.907193783 +0000 UTC m=+1402.457114422" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.909315 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9n4lr" event={"ID":"b6d8c16c-799a-4946-80d8-5db5566e76c9","Type":"ContainerStarted","Data":"fcd04e6ba101c226ce486909cafb42e88b51add5c7c331c1698a093284551f03"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.909364 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9n4lr" event={"ID":"b6d8c16c-799a-4946-80d8-5db5566e76c9","Type":"ContainerStarted","Data":"5cf3103802bc0210cbb11eef24324920edb9c408a9b92b79ca1abb0f6047354d"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.933405 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c41-account-create-update-qsnfn" 
event={"ID":"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb","Type":"ContainerStarted","Data":"02b675be8b0d3793c287ddec2ea292e0fefd05e85f65de2310afd3b338a8549c"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.933476 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c41-account-create-update-qsnfn" event={"ID":"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb","Type":"ContainerStarted","Data":"2e06bc4bdbcb0653442e8ab77db38d9c6ba94c2a48d393d9af89d93f1b10f988"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.936238 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-qhdnd" podStartSLOduration=8.936206657 podStartE2EDuration="8.936206657s" podCreationTimestamp="2026-03-18 12:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:02.927731506 +0000 UTC m=+1402.477652145" watchObservedRunningTime="2026-03-18 12:33:02.936206657 +0000 UTC m=+1402.486127296" Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.940236 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" event={"ID":"f3477f12-9377-4146-a6a0-1cf5fa1c9dae","Type":"ContainerStarted","Data":"cac3aa7baca1c481ff2d6c82cc731c0849f2635fd4ed78439f67735301e095b3"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.940318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" event={"ID":"f3477f12-9377-4146-a6a0-1cf5fa1c9dae","Type":"ContainerStarted","Data":"c5b4cced64471507f017acda1956b6badafc2046862fb5d3a0471c0a48f16fd0"} Mar 18 12:33:02 crc kubenswrapper[4921]: I0318 12:33:02.971758 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-9n4lr" podStartSLOduration=8.971734175 podStartE2EDuration="8.971734175s" podCreationTimestamp="2026-03-18 12:32:54 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:02.95924112 +0000 UTC m=+1402.509161759" watchObservedRunningTime="2026-03-18 12:33:02.971734175 +0000 UTC m=+1402.521654814" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.039448 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6c41-account-create-update-qsnfn" podStartSLOduration=9.039427186 podStartE2EDuration="9.039427186s" podCreationTimestamp="2026-03-18 12:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:02.981137292 +0000 UTC m=+1402.531057921" watchObservedRunningTime="2026-03-18 12:33:03.039427186 +0000 UTC m=+1402.589347825" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.045075 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" podStartSLOduration=9.045056576 podStartE2EDuration="9.045056576s" podCreationTimestamp="2026-03-18 12:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:03.013013747 +0000 UTC m=+1402.562934386" watchObservedRunningTime="2026-03-18 12:33:03.045056576 +0000 UTC m=+1402.594977215" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.234364 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d6ceca-c377-42d4-b7f6-d5a9a083bbe7" path="/var/lib/kubelet/pods/02d6ceca-c377-42d4-b7f6-d5a9a083bbe7/volumes" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.236830 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d65e295-0914-4c05-bd33-fa99e512893e" path="/var/lib/kubelet/pods/6d65e295-0914-4c05-bd33-fa99e512893e/volumes" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 
12:33:03.237801 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" path="/var/lib/kubelet/pods/879edc6c-5a15-4316-9f8f-58bcf8d87b95/volumes" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.241290 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4a58e9-3870-4b79-bbb6-6ec610898b96" path="/var/lib/kubelet/pods/9f4a58e9-3870-4b79-bbb6-6ec610898b96/volumes" Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.427083 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:03 crc kubenswrapper[4921]: W0318 12:33:03.446903 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ada97c3_d3f9_4fd7_9aeb_5ac7d45b46bf.slice/crio-380a8908656a348eaf0d21e91db7964db5cd144c4269305029bc41611a373e8f WatchSource:0}: Error finding container 380a8908656a348eaf0d21e91db7964db5cd144c4269305029bc41611a373e8f: Status 404 returned error can't find the container with id 380a8908656a348eaf0d21e91db7964db5cd144c4269305029bc41611a373e8f Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.527233 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.949703 4921 generic.go:334] "Generic (PLEG): container finished" podID="9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" containerID="02b675be8b0d3793c287ddec2ea292e0fefd05e85f65de2310afd3b338a8549c" exitCode=0 Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.949755 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c41-account-create-update-qsnfn" event={"ID":"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb","Type":"ContainerDied","Data":"02b675be8b0d3793c287ddec2ea292e0fefd05e85f65de2310afd3b338a8549c"} Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.954671 4921 generic.go:334] "Generic 
(PLEG): container finished" podID="85f3503b-9ffc-4667-b9f5-9a4880895948" containerID="8ae61aa8c057c79dfa1422f040343df978c73fafb418d65005dcbeffb56a0b62" exitCode=0 Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.954757 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" event={"ID":"85f3503b-9ffc-4667-b9f5-9a4880895948","Type":"ContainerDied","Data":"8ae61aa8c057c79dfa1422f040343df978c73fafb418d65005dcbeffb56a0b62"} Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.961594 4921 generic.go:334] "Generic (PLEG): container finished" podID="2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" containerID="f9b427283c3c62291529dbbfa222883688637448a5c073c793a49f2cb67e24a5" exitCode=0 Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.961724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qhdnd" event={"ID":"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8","Type":"ContainerDied","Data":"f9b427283c3c62291529dbbfa222883688637448a5c073c793a49f2cb67e24a5"} Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.972025 4921 generic.go:334] "Generic (PLEG): container finished" podID="b6d8c16c-799a-4946-80d8-5db5566e76c9" containerID="fcd04e6ba101c226ce486909cafb42e88b51add5c7c331c1698a093284551f03" exitCode=0 Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.972299 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9n4lr" event={"ID":"b6d8c16c-799a-4946-80d8-5db5566e76c9","Type":"ContainerDied","Data":"fcd04e6ba101c226ce486909cafb42e88b51add5c7c331c1698a093284551f03"} Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.976078 4921 generic.go:334] "Generic (PLEG): container finished" podID="f3477f12-9377-4146-a6a0-1cf5fa1c9dae" containerID="cac3aa7baca1c481ff2d6c82cc731c0849f2635fd4ed78439f67735301e095b3" exitCode=0 Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.976401 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-3a84-account-create-update-9plfj" event={"ID":"f3477f12-9377-4146-a6a0-1cf5fa1c9dae","Type":"ContainerDied","Data":"cac3aa7baca1c481ff2d6c82cc731c0849f2635fd4ed78439f67735301e095b3"} Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.978372 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"778f8baf-82ce-457d-b32d-35d3abe1a79d","Type":"ContainerStarted","Data":"106bdca30eb650df18be845da90fb4c97ae19396f0dffcbef1e8c42102f9c63c"} Mar 18 12:33:03 crc kubenswrapper[4921]: I0318 12:33:03.982470 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf","Type":"ContainerStarted","Data":"380a8908656a348eaf0d21e91db7964db5cd144c4269305029bc41611a373e8f"} Mar 18 12:33:04 crc kubenswrapper[4921]: I0318 12:33:04.004588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerStarted","Data":"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c"} Mar 18 12:33:04 crc kubenswrapper[4921]: I0318 12:33:04.007520 4921 generic.go:334] "Generic (PLEG): container finished" podID="405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" containerID="f4a7f718a064ff5327ef0699ff8ee174589413ea084ad26555e7c4fd891adbf4" exitCode=0 Mar 18 12:33:04 crc kubenswrapper[4921]: I0318 12:33:04.007601 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jt4qq" event={"ID":"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9","Type":"ContainerDied","Data":"f4a7f718a064ff5327ef0699ff8ee174589413ea084ad26555e7c4fd891adbf4"} Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.019625 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf","Type":"ContainerStarted","Data":"db9a0c1811b401c4f15334f71054371c07cdb06d6ce632ece0b7217159b060b3"} Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.019700 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf","Type":"ContainerStarted","Data":"ed60a39c31c959721c7809f4880462dd279124305bcfe7b9ac91a6dcce1bd85e"} Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.022550 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"778f8baf-82ce-457d-b32d-35d3abe1a79d","Type":"ContainerStarted","Data":"404fbdad0acbc101609fe321e0ebd443a1518297cc849df46edbfb574ac4328a"} Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.022596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"778f8baf-82ce-457d-b32d-35d3abe1a79d","Type":"ContainerStarted","Data":"e282050c14e51eefd65b0a5667448f8285cab09cc7f7c0ec5267fa01ddcbb423"} Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.048900 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.048875714 podStartE2EDuration="4.048875714s" podCreationTimestamp="2026-03-18 12:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:05.042897495 +0000 UTC m=+1404.592818134" watchObservedRunningTime="2026-03-18 12:33:05.048875714 +0000 UTC m=+1404.598796353" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.072379 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.072358061 podStartE2EDuration="3.072358061s" podCreationTimestamp="2026-03-18 12:33:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:05.069427998 +0000 UTC m=+1404.619348627" watchObservedRunningTime="2026-03-18 12:33:05.072358061 +0000 UTC m=+1404.622278700" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.463808 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.599783 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85f3503b-9ffc-4667-b9f5-9a4880895948-operator-scripts\") pod \"85f3503b-9ffc-4667-b9f5-9a4880895948\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.600211 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmkkj\" (UniqueName: \"kubernetes.io/projected/85f3503b-9ffc-4667-b9f5-9a4880895948-kube-api-access-pmkkj\") pod \"85f3503b-9ffc-4667-b9f5-9a4880895948\" (UID: \"85f3503b-9ffc-4667-b9f5-9a4880895948\") " Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.606812 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f3503b-9ffc-4667-b9f5-9a4880895948-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85f3503b-9ffc-4667-b9f5-9a4880895948" (UID: "85f3503b-9ffc-4667-b9f5-9a4880895948"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.608097 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f3503b-9ffc-4667-b9f5-9a4880895948-kube-api-access-pmkkj" (OuterVolumeSpecName: "kube-api-access-pmkkj") pod "85f3503b-9ffc-4667-b9f5-9a4880895948" (UID: "85f3503b-9ffc-4667-b9f5-9a4880895948"). InnerVolumeSpecName "kube-api-access-pmkkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.702259 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85f3503b-9ffc-4667-b9f5-9a4880895948-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.702286 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmkkj\" (UniqueName: \"kubernetes.io/projected/85f3503b-9ffc-4667-b9f5-9a4880895948-kube-api-access-pmkkj\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.828896 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.841437 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9n4lr" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.869293 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c41-account-create-update-qsnfn" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.889182 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:33:05 crc kubenswrapper[4921]: I0318 12:33:05.899434 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qhdnd" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.005737 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-operator-scripts\") pod \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006042 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqhxm\" (UniqueName: \"kubernetes.io/projected/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-kube-api-access-vqhxm\") pod \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006101 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6qnc\" (UniqueName: \"kubernetes.io/projected/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-kube-api-access-c6qnc\") pod \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\" (UID: \"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006175 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cqms\" (UniqueName: \"kubernetes.io/projected/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-kube-api-access-5cqms\") pod \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006213 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gmdr\" (UniqueName: \"kubernetes.io/projected/b6d8c16c-799a-4946-80d8-5db5566e76c9-kube-api-access-9gmdr\") pod \"b6d8c16c-799a-4946-80d8-5db5566e76c9\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006238 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-b68l8\" (UniqueName: \"kubernetes.io/projected/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-kube-api-access-b68l8\") pod \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006269 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6d8c16c-799a-4946-80d8-5db5566e76c9-operator-scripts\") pod \"b6d8c16c-799a-4946-80d8-5db5566e76c9\" (UID: \"b6d8c16c-799a-4946-80d8-5db5566e76c9\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006333 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-operator-scripts\") pod \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\" (UID: \"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006367 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-operator-scripts\") pod \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\" (UID: \"f3477f12-9377-4146-a6a0-1cf5fa1c9dae\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.006521 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-operator-scripts\") pod \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\" (UID: \"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.007351 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" (UID: "9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.007577 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.007853 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d8c16c-799a-4946-80d8-5db5566e76c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6d8c16c-799a-4946-80d8-5db5566e76c9" (UID: "b6d8c16c-799a-4946-80d8-5db5566e76c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.007866 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3477f12-9377-4146-a6a0-1cf5fa1c9dae" (UID: "f3477f12-9377-4146-a6a0-1cf5fa1c9dae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.007915 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" (UID: "405b0dbf-0f32-499a-a71f-9ed82f5eb6a9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.008282 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" (UID: "2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.010594 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-kube-api-access-b68l8" (OuterVolumeSpecName: "kube-api-access-b68l8") pod "405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" (UID: "405b0dbf-0f32-499a-a71f-9ed82f5eb6a9"). InnerVolumeSpecName "kube-api-access-b68l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.010653 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-kube-api-access-c6qnc" (OuterVolumeSpecName: "kube-api-access-c6qnc") pod "2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" (UID: "2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8"). InnerVolumeSpecName "kube-api-access-c6qnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.011582 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-kube-api-access-5cqms" (OuterVolumeSpecName: "kube-api-access-5cqms") pod "9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" (UID: "9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb"). InnerVolumeSpecName "kube-api-access-5cqms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.012384 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d8c16c-799a-4946-80d8-5db5566e76c9-kube-api-access-9gmdr" (OuterVolumeSpecName: "kube-api-access-9gmdr") pod "b6d8c16c-799a-4946-80d8-5db5566e76c9" (UID: "b6d8c16c-799a-4946-80d8-5db5566e76c9"). InnerVolumeSpecName "kube-api-access-9gmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.012741 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-kube-api-access-vqhxm" (OuterVolumeSpecName: "kube-api-access-vqhxm") pod "f3477f12-9377-4146-a6a0-1cf5fa1c9dae" (UID: "f3477f12-9377-4146-a6a0-1cf5fa1c9dae"). InnerVolumeSpecName "kube-api-access-vqhxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.044997 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qhdnd" event={"ID":"2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8","Type":"ContainerDied","Data":"9a01baf9c11816ceec4cf994180e04b040fa71907abaabd6e99b2d25f487d90c"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.045045 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a01baf9c11816ceec4cf994180e04b040fa71907abaabd6e99b2d25f487d90c" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.045154 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qhdnd" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.051303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9n4lr" event={"ID":"b6d8c16c-799a-4946-80d8-5db5566e76c9","Type":"ContainerDied","Data":"5cf3103802bc0210cbb11eef24324920edb9c408a9b92b79ca1abb0f6047354d"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.051349 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9n4lr" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.051356 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cf3103802bc0210cbb11eef24324920edb9c408a9b92b79ca1abb0f6047354d" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.056016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c41-account-create-update-qsnfn" event={"ID":"9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb","Type":"ContainerDied","Data":"2e06bc4bdbcb0653442e8ab77db38d9c6ba94c2a48d393d9af89d93f1b10f988"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.056061 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e06bc4bdbcb0653442e8ab77db38d9c6ba94c2a48d393d9af89d93f1b10f988" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.056067 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6c41-account-create-update-qsnfn" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.058445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" event={"ID":"f3477f12-9377-4146-a6a0-1cf5fa1c9dae","Type":"ContainerDied","Data":"c5b4cced64471507f017acda1956b6badafc2046862fb5d3a0471c0a48f16fd0"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.058479 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5b4cced64471507f017acda1956b6badafc2046862fb5d3a0471c0a48f16fd0" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.058533 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3a84-account-create-update-9plfj" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.062214 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" event={"ID":"85f3503b-9ffc-4667-b9f5-9a4880895948","Type":"ContainerDied","Data":"e69b069aa8b8c7cf8b4c7698b2eefe1005a8cc12a7b1da2cec47e5b71c9cc031"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.062251 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e69b069aa8b8c7cf8b4c7698b2eefe1005a8cc12a7b1da2cec47e5b71c9cc031" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.062313 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-aea5-account-create-update-glsg4" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.071539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerStarted","Data":"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.071784 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-central-agent" containerID="cri-o://ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" gracePeriod=30 Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.071943 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.072248 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="proxy-httpd" containerID="cri-o://7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" gracePeriod=30 Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.072346 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="sg-core" containerID="cri-o://5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" gracePeriod=30 Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.072421 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-notification-agent" containerID="cri-o://91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" gracePeriod=30 Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 
12:33:06.080965 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jt4qq" event={"ID":"405b0dbf-0f32-499a-a71f-9ed82f5eb6a9","Type":"ContainerDied","Data":"36187e7b1a8c327f019902a46a075f12f6955a7f57f629de23649e452d6ddbc0"} Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.081002 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jt4qq" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.081022 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36187e7b1a8c327f019902a46a075f12f6955a7f57f629de23649e452d6ddbc0" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109063 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqhxm\" (UniqueName: \"kubernetes.io/projected/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-kube-api-access-vqhxm\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109086 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6qnc\" (UniqueName: \"kubernetes.io/projected/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-kube-api-access-c6qnc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109096 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cqms\" (UniqueName: \"kubernetes.io/projected/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb-kube-api-access-5cqms\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109105 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gmdr\" (UniqueName: \"kubernetes.io/projected/b6d8c16c-799a-4946-80d8-5db5566e76c9-kube-api-access-9gmdr\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109115 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68l8\" (UniqueName: 
\"kubernetes.io/projected/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-kube-api-access-b68l8\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109135 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6d8c16c-799a-4946-80d8-5db5566e76c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109144 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3477f12-9377-4146-a6a0-1cf5fa1c9dae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109152 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.109161 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.113830 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.182306087 podStartE2EDuration="15.113808802s" podCreationTimestamp="2026-03-18 12:32:51 +0000 UTC" firstStartedPulling="2026-03-18 12:32:52.713570952 +0000 UTC m=+1392.263491591" lastFinishedPulling="2026-03-18 12:33:05.645073667 +0000 UTC m=+1405.194994306" observedRunningTime="2026-03-18 12:33:06.099256589 +0000 UTC m=+1405.649177248" watchObservedRunningTime="2026-03-18 12:33:06.113808802 +0000 UTC m=+1405.663729441" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.819256 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930148 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-combined-ca-bundle\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930292 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-sg-core-conf-yaml\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930393 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-run-httpd\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930450 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6vdl\" (UniqueName: \"kubernetes.io/projected/5249cc52-d843-4717-9182-85a5d617b94c-kube-api-access-m6vdl\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930500 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-config-data\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930570 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-log-httpd\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930602 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-scripts\") pod \"5249cc52-d843-4717-9182-85a5d617b94c\" (UID: \"5249cc52-d843-4717-9182-85a5d617b94c\") " Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.930912 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.931067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.936828 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-scripts" (OuterVolumeSpecName: "scripts") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.937259 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5249cc52-d843-4717-9182-85a5d617b94c-kube-api-access-m6vdl" (OuterVolumeSpecName: "kube-api-access-m6vdl") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "kube-api-access-m6vdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:06 crc kubenswrapper[4921]: I0318 12:33:06.962069 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.012528 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.027918 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-config-data" (OuterVolumeSpecName: "config-data") pod "5249cc52-d843-4717-9182-85a5d617b94c" (UID: "5249cc52-d843-4717-9182-85a5d617b94c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032522 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032569 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6vdl\" (UniqueName: \"kubernetes.io/projected/5249cc52-d843-4717-9182-85a5d617b94c-kube-api-access-m6vdl\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032586 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032600 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5249cc52-d843-4717-9182-85a5d617b94c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032612 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032623 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.032634 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5249cc52-d843-4717-9182-85a5d617b94c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095220 4921 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095256 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerDied","Data":"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a"} Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095366 4921 scope.go:117] "RemoveContainer" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095185 4921 generic.go:334] "Generic (PLEG): container finished" podID="5249cc52-d843-4717-9182-85a5d617b94c" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" exitCode=0 Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095625 4921 generic.go:334] "Generic (PLEG): container finished" podID="5249cc52-d843-4717-9182-85a5d617b94c" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" exitCode=2 Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095636 4921 generic.go:334] "Generic (PLEG): container finished" podID="5249cc52-d843-4717-9182-85a5d617b94c" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" exitCode=0 Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095645 4921 generic.go:334] "Generic (PLEG): container finished" podID="5249cc52-d843-4717-9182-85a5d617b94c" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" exitCode=0 Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095664 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerDied","Data":"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c"} Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095682 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerDied","Data":"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96"} Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095695 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerDied","Data":"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba"} Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.095720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5249cc52-d843-4717-9182-85a5d617b94c","Type":"ContainerDied","Data":"cee963c66792f67bd4d8cccf723e350d7286d54257d715331ec92aa359db8fa4"} Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.137489 4921 scope.go:117] "RemoveContainer" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.154410 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.163975 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.175462 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.175941 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-central-agent" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.175961 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-central-agent" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.175979 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" 
containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.175988 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176003 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="proxy-httpd" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176013 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="proxy-httpd" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176029 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f3503b-9ffc-4667-b9f5-9a4880895948" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176037 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f3503b-9ffc-4667-b9f5-9a4880895948" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176051 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d8c16c-799a-4946-80d8-5db5566e76c9" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176060 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d8c16c-799a-4946-80d8-5db5566e76c9" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176084 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-notification-agent" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176092 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-notification-agent" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176102 4921 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f3477f12-9377-4146-a6a0-1cf5fa1c9dae" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176115 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3477f12-9377-4146-a6a0-1cf5fa1c9dae" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176150 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176159 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176174 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176181 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.176204 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="sg-core" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176211 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="sg-core" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176421 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-central-agent" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176436 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" containerName="mariadb-database-create" Mar 18 12:33:07 crc 
kubenswrapper[4921]: I0318 12:33:07.176451 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="ceilometer-notification-agent" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176462 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="sg-core" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176477 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176495 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5249cc52-d843-4717-9182-85a5d617b94c" containerName="proxy-httpd" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176508 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f3503b-9ffc-4667-b9f5-9a4880895948" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176523 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d8c16c-799a-4946-80d8-5db5566e76c9" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176534 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3477f12-9377-4146-a6a0-1cf5fa1c9dae" containerName="mariadb-account-create-update" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.176547 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" containerName="mariadb-database-create" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.177729 4921 scope.go:117] "RemoveContainer" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.178375 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.182733 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.205942 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.206226 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.223581 4921 scope.go:117] "RemoveContainer" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.231700 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5249cc52-d843-4717-9182-85a5d617b94c" path="/var/lib/kubelet/pods/5249cc52-d843-4717-9182-85a5d617b94c/volumes" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.262478 4921 scope.go:117] "RemoveContainer" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.263030 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": container with ID starting with 7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a not found: ID does not exist" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263062 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a"} err="failed to get container status \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": rpc error: code = NotFound desc = could not find 
container \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": container with ID starting with 7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263081 4921 scope.go:117] "RemoveContainer" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.263341 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": container with ID starting with 5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c not found: ID does not exist" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263363 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c"} err="failed to get container status \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": rpc error: code = NotFound desc = could not find container \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": container with ID starting with 5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263377 4921 scope.go:117] "RemoveContainer" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.263544 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": container with ID starting with 91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96 not found: ID does 
not exist" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263562 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96"} err="failed to get container status \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": rpc error: code = NotFound desc = could not find container \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": container with ID starting with 91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96 not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263574 4921 scope.go:117] "RemoveContainer" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" Mar 18 12:33:07 crc kubenswrapper[4921]: E0318 12:33:07.263715 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": container with ID starting with ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba not found: ID does not exist" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263735 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba"} err="failed to get container status \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": rpc error: code = NotFound desc = could not find container \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": container with ID starting with ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263746 4921 
scope.go:117] "RemoveContainer" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263882 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a"} err="failed to get container status \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": rpc error: code = NotFound desc = could not find container \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": container with ID starting with 7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.263898 4921 scope.go:117] "RemoveContainer" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264038 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c"} err="failed to get container status \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": rpc error: code = NotFound desc = could not find container \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": container with ID starting with 5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264054 4921 scope.go:117] "RemoveContainer" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264211 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96"} err="failed to get container status \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": rpc 
error: code = NotFound desc = could not find container \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": container with ID starting with 91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96 not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264226 4921 scope.go:117] "RemoveContainer" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264361 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba"} err="failed to get container status \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": rpc error: code = NotFound desc = could not find container \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": container with ID starting with ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264376 4921 scope.go:117] "RemoveContainer" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264510 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a"} err="failed to get container status \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": rpc error: code = NotFound desc = could not find container \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": container with ID starting with 7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264527 4921 scope.go:117] "RemoveContainer" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" Mar 18 12:33:07 crc 
kubenswrapper[4921]: I0318 12:33:07.264691 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c"} err="failed to get container status \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": rpc error: code = NotFound desc = could not find container \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": container with ID starting with 5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264708 4921 scope.go:117] "RemoveContainer" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264854 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96"} err="failed to get container status \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": rpc error: code = NotFound desc = could not find container \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": container with ID starting with 91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96 not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.264869 4921 scope.go:117] "RemoveContainer" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265000 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba"} err="failed to get container status \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": rpc error: code = NotFound desc = could not find container \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": container 
with ID starting with ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265016 4921 scope.go:117] "RemoveContainer" containerID="7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265166 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a"} err="failed to get container status \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": rpc error: code = NotFound desc = could not find container \"7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a\": container with ID starting with 7b1028c21772b9e987669e4ca40ce18ae5fd40f48f0003d285aafbe8cfd10d1a not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265189 4921 scope.go:117] "RemoveContainer" containerID="5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265372 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c"} err="failed to get container status \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": rpc error: code = NotFound desc = could not find container \"5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c\": container with ID starting with 5083c13f61d2658ac1b28f3f767ac26b05d5373fb6cc7e0241644df6ec4a5f8c not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265395 4921 scope.go:117] "RemoveContainer" containerID="91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265721 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96"} err="failed to get container status \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": rpc error: code = NotFound desc = could not find container \"91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96\": container with ID starting with 91ab99d21c3588053cd75735305b0ddd58547b7b6f209d75b08497ff1dfdea96 not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265748 4921 scope.go:117] "RemoveContainer" containerID="ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.265998 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba"} err="failed to get container status \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": rpc error: code = NotFound desc = could not find container \"ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba\": container with ID starting with ed1d7d6f041f24a23a50a2c77c995e8c31b2e8d9ca2c12846f57403c3b760dba not found: ID does not exist" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.338258 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmjd9\" (UniqueName: \"kubernetes.io/projected/363d4bb1-d356-49bf-8926-8b86f4330108-kube-api-access-vmjd9\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.339018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-run-httpd\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 
12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.339244 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-log-httpd\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.339437 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-config-data\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.339562 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.339620 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-scripts\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.339715 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441184 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vmjd9\" (UniqueName: \"kubernetes.io/projected/363d4bb1-d356-49bf-8926-8b86f4330108-kube-api-access-vmjd9\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-run-httpd\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441321 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-log-httpd\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-config-data\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441424 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441457 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-scripts\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc 
kubenswrapper[4921]: I0318 12:33:07.441498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.441941 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-run-httpd\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.442059 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-log-httpd\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.445961 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.446517 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.446552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-config-data\") pod \"ceilometer-0\" (UID: 
\"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.446526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-scripts\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.462922 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmjd9\" (UniqueName: \"kubernetes.io/projected/363d4bb1-d356-49bf-8926-8b86f4330108-kube-api-access-vmjd9\") pod \"ceilometer-0\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " pod="openstack/ceilometer-0" Mar 18 12:33:07 crc kubenswrapper[4921]: I0318 12:33:07.543139 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:08 crc kubenswrapper[4921]: I0318 12:33:08.054286 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:08 crc kubenswrapper[4921]: I0318 12:33:08.104496 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerStarted","Data":"b3af01fb620c3340c28aef4a2127aa9d9a6075a029dc7153b14df06a2f5e2c00"} Mar 18 12:33:09 crc kubenswrapper[4921]: I0318 12:33:09.115091 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerStarted","Data":"60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91"} Mar 18 12:33:09 crc kubenswrapper[4921]: I0318 12:33:09.441011 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.169784 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfzsh"] 
Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.172233 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.178612 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.178835 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2zk95" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.179293 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.189261 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfzsh"] Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.204918 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerStarted","Data":"bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6"} Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.287483 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.287908 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-scripts\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " 
pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.288067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8npc\" (UniqueName: \"kubernetes.io/projected/ff14077c-4ea8-4a45-925c-0f83af75745c-kube-api-access-b8npc\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.288214 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-config-data\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.389801 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8npc\" (UniqueName: \"kubernetes.io/projected/ff14077c-4ea8-4a45-925c-0f83af75745c-kube-api-access-b8npc\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.389982 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-config-data\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.390059 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.390184 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-scripts\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.396347 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.396528 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-scripts\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.414923 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-config-data\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.423087 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8npc\" (UniqueName: \"kubernetes.io/projected/ff14077c-4ea8-4a45-925c-0f83af75745c-kube-api-access-b8npc\") pod \"nova-cell0-conductor-db-sync-rfzsh\" (UID: 
\"ff14077c-4ea8-4a45-925c-0f83af75745c\") " pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:10 crc kubenswrapper[4921]: I0318 12:33:10.506425 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:11 crc kubenswrapper[4921]: I0318 12:33:11.058596 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfzsh"] Mar 18 12:33:11 crc kubenswrapper[4921]: I0318 12:33:11.223642 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" event={"ID":"ff14077c-4ea8-4a45-925c-0f83af75745c","Type":"ContainerStarted","Data":"ac77c2ee48d52023ee648fd792af7f69eec9b76e3cfb1225e697664fb05ad8cb"} Mar 18 12:33:11 crc kubenswrapper[4921]: I0318 12:33:11.228396 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerStarted","Data":"37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338"} Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.522399 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.522784 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.574492 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.582976 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.711026 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 
18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.711111 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.759549 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:33:12 crc kubenswrapper[4921]: I0318 12:33:12.765316 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.251411 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerStarted","Data":"a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac"} Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.257335 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.257378 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.257394 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.257404 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.251571 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-central-agent" containerID="cri-o://60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91" gracePeriod=30 Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.251771 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="sg-core" containerID="cri-o://37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338" gracePeriod=30 Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.251758 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="proxy-httpd" containerID="cri-o://a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac" gracePeriod=30 Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.251781 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-notification-agent" containerID="cri-o://bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6" gracePeriod=30 Mar 18 12:33:13 crc kubenswrapper[4921]: I0318 12:33:13.275527 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.410434795 podStartE2EDuration="6.275500393s" podCreationTimestamp="2026-03-18 12:33:07 +0000 UTC" firstStartedPulling="2026-03-18 12:33:08.061071194 +0000 UTC m=+1407.610991823" lastFinishedPulling="2026-03-18 12:33:11.926136782 +0000 UTC m=+1411.476057421" observedRunningTime="2026-03-18 12:33:13.270867062 +0000 UTC m=+1412.820787701" watchObservedRunningTime="2026-03-18 12:33:13.275500393 +0000 UTC m=+1412.825421032" Mar 18 12:33:14 crc kubenswrapper[4921]: I0318 12:33:14.294886 4921 generic.go:334] "Generic (PLEG): container finished" podID="363d4bb1-d356-49bf-8926-8b86f4330108" containerID="a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac" exitCode=0 Mar 18 12:33:14 crc kubenswrapper[4921]: I0318 12:33:14.294928 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="363d4bb1-d356-49bf-8926-8b86f4330108" containerID="37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338" exitCode=2 Mar 18 12:33:14 crc kubenswrapper[4921]: I0318 12:33:14.294943 4921 generic.go:334] "Generic (PLEG): container finished" podID="363d4bb1-d356-49bf-8926-8b86f4330108" containerID="bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6" exitCode=0 Mar 18 12:33:14 crc kubenswrapper[4921]: I0318 12:33:14.294960 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerDied","Data":"a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac"} Mar 18 12:33:14 crc kubenswrapper[4921]: I0318 12:33:14.295013 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerDied","Data":"37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338"} Mar 18 12:33:14 crc kubenswrapper[4921]: I0318 12:33:14.295031 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerDied","Data":"bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6"} Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.326863 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.326900 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.327229 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.327250 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.364072 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.411322 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.424598 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:15 crc kubenswrapper[4921]: I0318 12:33:15.700599 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.383371 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" event={"ID":"ff14077c-4ea8-4a45-925c-0f83af75745c","Type":"ContainerStarted","Data":"1162424b5b68d6c7d3d3885c3c665ba5e9a0ab486bfe59bb8e8945864846a899"} Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.409840 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" podStartSLOduration=1.93696359 podStartE2EDuration="10.409808107s" podCreationTimestamp="2026-03-18 12:33:10 +0000 UTC" firstStartedPulling="2026-03-18 12:33:11.049601942 +0000 UTC m=+1410.599522581" lastFinishedPulling="2026-03-18 12:33:19.522446459 +0000 UTC m=+1419.072367098" observedRunningTime="2026-03-18 12:33:20.398956349 +0000 UTC m=+1419.948877048" watchObservedRunningTime="2026-03-18 12:33:20.409808107 +0000 UTC m=+1419.959728766" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.835991 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.904957 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmjd9\" (UniqueName: \"kubernetes.io/projected/363d4bb1-d356-49bf-8926-8b86f4330108-kube-api-access-vmjd9\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.905064 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-config-data\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.905250 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-log-httpd\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.905343 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-scripts\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.905373 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-combined-ca-bundle\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.905417 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-run-httpd\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.905596 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-sg-core-conf-yaml\") pod \"363d4bb1-d356-49bf-8926-8b86f4330108\" (UID: \"363d4bb1-d356-49bf-8926-8b86f4330108\") " Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.906322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.906488 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.911278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363d4bb1-d356-49bf-8926-8b86f4330108-kube-api-access-vmjd9" (OuterVolumeSpecName: "kube-api-access-vmjd9") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). InnerVolumeSpecName "kube-api-access-vmjd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.921287 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-scripts" (OuterVolumeSpecName: "scripts") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.940801 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:20 crc kubenswrapper[4921]: I0318 12:33:20.986329 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.008305 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmjd9\" (UniqueName: \"kubernetes.io/projected/363d4bb1-d356-49bf-8926-8b86f4330108-kube-api-access-vmjd9\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.008343 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.008356 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.008368 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.008380 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/363d4bb1-d356-49bf-8926-8b86f4330108-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.008391 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.020563 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-config-data" (OuterVolumeSpecName: "config-data") pod "363d4bb1-d356-49bf-8926-8b86f4330108" (UID: "363d4bb1-d356-49bf-8926-8b86f4330108"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.112273 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/363d4bb1-d356-49bf-8926-8b86f4330108-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.394741 4921 generic.go:334] "Generic (PLEG): container finished" podID="363d4bb1-d356-49bf-8926-8b86f4330108" containerID="60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91" exitCode=0 Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.394795 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.394814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerDied","Data":"60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91"} Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.395247 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"363d4bb1-d356-49bf-8926-8b86f4330108","Type":"ContainerDied","Data":"b3af01fb620c3340c28aef4a2127aa9d9a6075a029dc7153b14df06a2f5e2c00"} Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.395267 4921 scope.go:117] "RemoveContainer" containerID="a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.417017 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.425335 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.434447 4921 scope.go:117] "RemoveContainer" 
containerID="37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448153 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.448616 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="proxy-httpd" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448641 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="proxy-httpd" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.448664 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-central-agent" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448673 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-central-agent" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.448698 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-notification-agent" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448705 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-notification-agent" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.448721 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="sg-core" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448727 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="sg-core" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448900 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-notification-agent" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448916 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="proxy-httpd" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448927 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="ceilometer-central-agent" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.448936 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" containerName="sg-core" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.450852 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.455659 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.455670 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.462880 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.480417 4921 scope.go:117] "RemoveContainer" containerID="bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.507452 4921 scope.go:117] "RemoveContainer" containerID="60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.517570 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-scripts\") pod \"ceilometer-0\" (UID: 
\"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.517657 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-run-httpd\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.517693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvdx\" (UniqueName: \"kubernetes.io/projected/d71a3136-a7d4-4d07-a5c4-9402067769df-kube-api-access-9fvdx\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.517767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-log-httpd\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.517823 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.517842 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-config-data\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 
12:33:21.517885 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.527889 4921 scope.go:117] "RemoveContainer" containerID="a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.528677 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac\": container with ID starting with a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac not found: ID does not exist" containerID="a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.528730 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac"} err="failed to get container status \"a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac\": rpc error: code = NotFound desc = could not find container \"a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac\": container with ID starting with a66808b4d68d64767bb40397baf515440387b638d1ebe06a46d3b3d4829b57ac not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.528764 4921 scope.go:117] "RemoveContainer" containerID="37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.529086 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338\": container with ID starting with 37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338 not found: ID does not exist" containerID="37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.529183 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338"} err="failed to get container status \"37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338\": rpc error: code = NotFound desc = could not find container \"37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338\": container with ID starting with 37f20d312cc1625f6962b521149ceb8e7d70cb5c49d8e7e110fd45e741583338 not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.529201 4921 scope.go:117] "RemoveContainer" containerID="bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.529531 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6\": container with ID starting with bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6 not found: ID does not exist" containerID="bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.529560 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6"} err="failed to get container status \"bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6\": rpc error: code = NotFound desc = could not find container \"bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6\": container with ID 
starting with bf910afe297d6da6a41afcfa3bea7f28e2047665de077d32f4d2ec4fb5b560b6 not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.529577 4921 scope.go:117] "RemoveContainer" containerID="60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91" Mar 18 12:33:21 crc kubenswrapper[4921]: E0318 12:33:21.529819 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91\": container with ID starting with 60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91 not found: ID does not exist" containerID="60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.529845 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91"} err="failed to get container status \"60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91\": rpc error: code = NotFound desc = could not find container \"60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91\": container with ID starting with 60510975c0dacf16da91b9d6267639d1e246b0467d45ee630f77b2aae418fd91 not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619021 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvdx\" (UniqueName: \"kubernetes.io/projected/d71a3136-a7d4-4d07-a5c4-9402067769df-kube-api-access-9fvdx\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619168 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-log-httpd\") pod \"ceilometer-0\" (UID: 
\"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619238 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619268 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-config-data\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619311 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-scripts\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619407 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-run-httpd\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619713 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-log-httpd\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.619897 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-run-httpd\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.624767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.625031 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.625137 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-scripts\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.632360 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-config-data\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.636995 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9fvdx\" (UniqueName: \"kubernetes.io/projected/d71a3136-a7d4-4d07-a5c4-9402067769df-kube-api-access-9fvdx\") pod \"ceilometer-0\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " pod="openstack/ceilometer-0" Mar 18 12:33:21 crc kubenswrapper[4921]: I0318 12:33:21.786128 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:22 crc kubenswrapper[4921]: I0318 12:33:22.255681 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:22 crc kubenswrapper[4921]: I0318 12:33:22.406103 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerStarted","Data":"3de5147135bea5c2acc505eca8d68758da3f6eac8d70983890f830172dedd32c"} Mar 18 12:33:23 crc kubenswrapper[4921]: I0318 12:33:23.226800 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="363d4bb1-d356-49bf-8926-8b86f4330108" path="/var/lib/kubelet/pods/363d4bb1-d356-49bf-8926-8b86f4330108/volumes" Mar 18 12:33:23 crc kubenswrapper[4921]: I0318 12:33:23.417797 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerStarted","Data":"c463d1163986dacd62027f7a4fd86b15ac78f4daa815c32afba033513d014aa0"} Mar 18 12:33:24 crc kubenswrapper[4921]: I0318 12:33:24.435874 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerStarted","Data":"ba68603e791c45d0a1b0ab162bd8855cc5016b05a9972616614aab1fbf9cae57"} Mar 18 12:33:24 crc kubenswrapper[4921]: I0318 12:33:24.436674 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerStarted","Data":"63770d95e7f6d985b6c45e2a172e625c0a9bbed0869b041ac2dcf6720b35f5d9"} Mar 18 12:33:26 crc kubenswrapper[4921]: I0318 12:33:26.459054 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerStarted","Data":"45c3c2943b91b00a92386dfcb47fa53e8fed1169bd430b583251389bd2711cd8"} Mar 18 12:33:26 crc kubenswrapper[4921]: I0318 12:33:26.459592 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:33:26 crc kubenswrapper[4921]: I0318 12:33:26.492929 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.861150817 podStartE2EDuration="5.492892822s" podCreationTimestamp="2026-03-18 12:33:21 +0000 UTC" firstStartedPulling="2026-03-18 12:33:22.254680552 +0000 UTC m=+1421.804601191" lastFinishedPulling="2026-03-18 12:33:25.886422557 +0000 UTC m=+1425.436343196" observedRunningTime="2026-03-18 12:33:26.486516601 +0000 UTC m=+1426.036437250" watchObservedRunningTime="2026-03-18 12:33:26.492892822 +0000 UTC m=+1426.042813461" Mar 18 12:33:28 crc kubenswrapper[4921]: I0318 12:33:28.170803 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:28 crc kubenswrapper[4921]: I0318 12:33:28.475813 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-central-agent" containerID="cri-o://c463d1163986dacd62027f7a4fd86b15ac78f4daa815c32afba033513d014aa0" gracePeriod=30 Mar 18 12:33:28 crc kubenswrapper[4921]: I0318 12:33:28.475888 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="sg-core" 
containerID="cri-o://ba68603e791c45d0a1b0ab162bd8855cc5016b05a9972616614aab1fbf9cae57" gracePeriod=30 Mar 18 12:33:28 crc kubenswrapper[4921]: I0318 12:33:28.475905 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-notification-agent" containerID="cri-o://63770d95e7f6d985b6c45e2a172e625c0a9bbed0869b041ac2dcf6720b35f5d9" gracePeriod=30 Mar 18 12:33:28 crc kubenswrapper[4921]: I0318 12:33:28.475983 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="proxy-httpd" containerID="cri-o://45c3c2943b91b00a92386dfcb47fa53e8fed1169bd430b583251389bd2711cd8" gracePeriod=30 Mar 18 12:33:29 crc kubenswrapper[4921]: I0318 12:33:29.486971 4921 generic.go:334] "Generic (PLEG): container finished" podID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerID="45c3c2943b91b00a92386dfcb47fa53e8fed1169bd430b583251389bd2711cd8" exitCode=0 Mar 18 12:33:29 crc kubenswrapper[4921]: I0318 12:33:29.487008 4921 generic.go:334] "Generic (PLEG): container finished" podID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerID="ba68603e791c45d0a1b0ab162bd8855cc5016b05a9972616614aab1fbf9cae57" exitCode=2 Mar 18 12:33:29 crc kubenswrapper[4921]: I0318 12:33:29.487017 4921 generic.go:334] "Generic (PLEG): container finished" podID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerID="63770d95e7f6d985b6c45e2a172e625c0a9bbed0869b041ac2dcf6720b35f5d9" exitCode=0 Mar 18 12:33:29 crc kubenswrapper[4921]: I0318 12:33:29.487036 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerDied","Data":"45c3c2943b91b00a92386dfcb47fa53e8fed1169bd430b583251389bd2711cd8"} Mar 18 12:33:29 crc kubenswrapper[4921]: I0318 12:33:29.487065 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerDied","Data":"ba68603e791c45d0a1b0ab162bd8855cc5016b05a9972616614aab1fbf9cae57"} Mar 18 12:33:29 crc kubenswrapper[4921]: I0318 12:33:29.487085 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerDied","Data":"63770d95e7f6d985b6c45e2a172e625c0a9bbed0869b041ac2dcf6720b35f5d9"} Mar 18 12:33:30 crc kubenswrapper[4921]: I0318 12:33:30.481173 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:33:30 crc kubenswrapper[4921]: I0318 12:33:30.481225 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="879edc6c-5a15-4316-9f8f-58bcf8d87b95" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:33:31 crc kubenswrapper[4921]: I0318 12:33:31.524980 4921 generic.go:334] "Generic (PLEG): container finished" podID="ff14077c-4ea8-4a45-925c-0f83af75745c" containerID="1162424b5b68d6c7d3d3885c3c665ba5e9a0ab486bfe59bb8e8945864846a899" exitCode=0 Mar 18 12:33:31 crc kubenswrapper[4921]: I0318 12:33:31.524996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" event={"ID":"ff14077c-4ea8-4a45-925c-0f83af75745c","Type":"ContainerDied","Data":"1162424b5b68d6c7d3d3885c3c665ba5e9a0ab486bfe59bb8e8945864846a899"} Mar 18 12:33:32 crc kubenswrapper[4921]: I0318 12:33:32.856327 4921 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:32 crc kubenswrapper[4921]: I0318 12:33:32.981285 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8npc\" (UniqueName: \"kubernetes.io/projected/ff14077c-4ea8-4a45-925c-0f83af75745c-kube-api-access-b8npc\") pod \"ff14077c-4ea8-4a45-925c-0f83af75745c\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " Mar 18 12:33:32 crc kubenswrapper[4921]: I0318 12:33:32.981371 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-combined-ca-bundle\") pod \"ff14077c-4ea8-4a45-925c-0f83af75745c\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " Mar 18 12:33:32 crc kubenswrapper[4921]: I0318 12:33:32.981540 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-config-data\") pod \"ff14077c-4ea8-4a45-925c-0f83af75745c\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " Mar 18 12:33:32 crc kubenswrapper[4921]: I0318 12:33:32.981695 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-scripts\") pod \"ff14077c-4ea8-4a45-925c-0f83af75745c\" (UID: \"ff14077c-4ea8-4a45-925c-0f83af75745c\") " Mar 18 12:33:32 crc kubenswrapper[4921]: I0318 12:33:32.993314 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-scripts" (OuterVolumeSpecName: "scripts") pod "ff14077c-4ea8-4a45-925c-0f83af75745c" (UID: "ff14077c-4ea8-4a45-925c-0f83af75745c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.006949 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff14077c-4ea8-4a45-925c-0f83af75745c-kube-api-access-b8npc" (OuterVolumeSpecName: "kube-api-access-b8npc") pod "ff14077c-4ea8-4a45-925c-0f83af75745c" (UID: "ff14077c-4ea8-4a45-925c-0f83af75745c"). InnerVolumeSpecName "kube-api-access-b8npc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.015929 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff14077c-4ea8-4a45-925c-0f83af75745c" (UID: "ff14077c-4ea8-4a45-925c-0f83af75745c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.017341 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-config-data" (OuterVolumeSpecName: "config-data") pod "ff14077c-4ea8-4a45-925c-0f83af75745c" (UID: "ff14077c-4ea8-4a45-925c-0f83af75745c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.084276 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.084323 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.084338 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8npc\" (UniqueName: \"kubernetes.io/projected/ff14077c-4ea8-4a45-925c-0f83af75745c-kube-api-access-b8npc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.084352 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff14077c-4ea8-4a45-925c-0f83af75745c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.550366 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" event={"ID":"ff14077c-4ea8-4a45-925c-0f83af75745c","Type":"ContainerDied","Data":"ac77c2ee48d52023ee648fd792af7f69eec9b76e3cfb1225e697664fb05ad8cb"} Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.550422 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac77c2ee48d52023ee648fd792af7f69eec9b76e3cfb1225e697664fb05ad8cb" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.550441 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfzsh" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.553622 4921 generic.go:334] "Generic (PLEG): container finished" podID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerID="c463d1163986dacd62027f7a4fd86b15ac78f4daa815c32afba033513d014aa0" exitCode=0 Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.553686 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerDied","Data":"c463d1163986dacd62027f7a4fd86b15ac78f4daa815c32afba033513d014aa0"} Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.728459 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:33:33 crc kubenswrapper[4921]: E0318 12:33:33.729792 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff14077c-4ea8-4a45-925c-0f83af75745c" containerName="nova-cell0-conductor-db-sync" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.729814 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff14077c-4ea8-4a45-925c-0f83af75745c" containerName="nova-cell0-conductor-db-sync" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.730036 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff14077c-4ea8-4a45-925c-0f83af75745c" containerName="nova-cell0-conductor-db-sync" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.730715 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.735952 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2zk95" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.736669 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.748306 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.750861 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.896929 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-scripts\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.897481 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-log-httpd\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.897559 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-sg-core-conf-yaml\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.897594 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-run-httpd\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.897647 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvdx\" (UniqueName: \"kubernetes.io/projected/d71a3136-a7d4-4d07-a5c4-9402067769df-kube-api-access-9fvdx\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.898063 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.898217 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.898261 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-config-data\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.898647 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-combined-ca-bundle\") pod \"d71a3136-a7d4-4d07-a5c4-9402067769df\" (UID: \"d71a3136-a7d4-4d07-a5c4-9402067769df\") " Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.899451 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.899844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.900042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4jf\" (UniqueName: \"kubernetes.io/projected/f4f8bb3e-7a43-4a4d-8012-143658c681fc-kube-api-access-qb4jf\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:33 crc kubenswrapper[4921]: 
I0318 12:33:33.900720 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.900867 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71a3136-a7d4-4d07-a5c4-9402067769df-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.905516 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-scripts" (OuterVolumeSpecName: "scripts") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.905549 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71a3136-a7d4-4d07-a5c4-9402067769df-kube-api-access-9fvdx" (OuterVolumeSpecName: "kube-api-access-9fvdx") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "kube-api-access-9fvdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.932406 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:33 crc kubenswrapper[4921]: I0318 12:33:33.965746 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.000297 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-config-data" (OuterVolumeSpecName: "config-data") pod "d71a3136-a7d4-4d07-a5c4-9402067769df" (UID: "d71a3136-a7d4-4d07-a5c4-9402067769df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.002204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4jf\" (UniqueName: \"kubernetes.io/projected/f4f8bb3e-7a43-4a4d-8012-143658c681fc-kube-api-access-qb4jf\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.002342 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.002402 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.003033 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.003275 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvdx\" (UniqueName: \"kubernetes.io/projected/d71a3136-a7d4-4d07-a5c4-9402067769df-kube-api-access-9fvdx\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.003292 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.003303 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.003314 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a3136-a7d4-4d07-a5c4-9402067769df-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.007606 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.016745 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.021779 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4jf\" (UniqueName: \"kubernetes.io/projected/f4f8bb3e-7a43-4a4d-8012-143658c681fc-kube-api-access-qb4jf\") pod \"nova-cell0-conductor-0\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.117000 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.566474 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.568806 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71a3136-a7d4-4d07-a5c4-9402067769df","Type":"ContainerDied","Data":"3de5147135bea5c2acc505eca8d68758da3f6eac8d70983890f830172dedd32c"} Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.568868 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.568883 4921 scope.go:117] "RemoveContainer" containerID="45c3c2943b91b00a92386dfcb47fa53e8fed1169bd430b583251389bd2711cd8" Mar 18 12:33:34 crc kubenswrapper[4921]: W0318 12:33:34.572406 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f8bb3e_7a43_4a4d_8012_143658c681fc.slice/crio-d9fb53a4ac35892c34734ee91b218f6b12a7828255b417f284badbbfa5f0e628 WatchSource:0}: Error finding container d9fb53a4ac35892c34734ee91b218f6b12a7828255b417f284badbbfa5f0e628: Status 404 returned error can't find the container with id d9fb53a4ac35892c34734ee91b218f6b12a7828255b417f284badbbfa5f0e628 Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.606424 4921 scope.go:117] "RemoveContainer" containerID="ba68603e791c45d0a1b0ab162bd8855cc5016b05a9972616614aab1fbf9cae57" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.612417 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.638654 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.662850 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:34 crc kubenswrapper[4921]: E0318 12:33:34.663525 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-central-agent" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.663631 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-central-agent" Mar 18 12:33:34 crc kubenswrapper[4921]: E0318 12:33:34.663722 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" 
containerName="sg-core" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.663803 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="sg-core" Mar 18 12:33:34 crc kubenswrapper[4921]: E0318 12:33:34.663875 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="proxy-httpd" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.663938 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="proxy-httpd" Mar 18 12:33:34 crc kubenswrapper[4921]: E0318 12:33:34.664022 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-notification-agent" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.664082 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-notification-agent" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.664433 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-notification-agent" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.664519 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="ceilometer-central-agent" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.664613 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="proxy-httpd" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.664695 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" containerName="sg-core" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.667491 4921 scope.go:117] "RemoveContainer" 
containerID="63770d95e7f6d985b6c45e2a172e625c0a9bbed0869b041ac2dcf6720b35f5d9" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.667951 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.670609 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.670752 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.673087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.702666 4921 scope.go:117] "RemoveContainer" containerID="c463d1163986dacd62027f7a4fd86b15ac78f4daa815c32afba033513d014aa0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.819370 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-run-httpd\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.819449 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxtw\" (UniqueName: \"kubernetes.io/projected/d4ec8a1f-95dc-4169-aee6-899cbb6db594-kube-api-access-fsxtw\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.819634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-config-data\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " 
pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.819817 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.820009 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-scripts\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.820121 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-log-httpd\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.820165 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.921926 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-config-data\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.922042 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.922133 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-scripts\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.922175 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-log-httpd\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.922198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.922264 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-run-httpd\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.922295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsxtw\" (UniqueName: \"kubernetes.io/projected/d4ec8a1f-95dc-4169-aee6-899cbb6db594-kube-api-access-fsxtw\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 
12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.923529 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-log-httpd\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.923566 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-run-httpd\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.931021 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.931363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-scripts\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.931363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.934823 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-config-data\") pod \"ceilometer-0\" (UID: 
\"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.942924 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxtw\" (UniqueName: \"kubernetes.io/projected/d4ec8a1f-95dc-4169-aee6-899cbb6db594-kube-api-access-fsxtw\") pod \"ceilometer-0\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " pod="openstack/ceilometer-0" Mar 18 12:33:34 crc kubenswrapper[4921]: I0318 12:33:34.993226 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.224664 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71a3136-a7d4-4d07-a5c4-9402067769df" path="/var/lib/kubelet/pods/d71a3136-a7d4-4d07-a5c4-9402067769df/volumes" Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.451060 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:35 crc kubenswrapper[4921]: W0318 12:33:35.456544 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ec8a1f_95dc_4169_aee6_899cbb6db594.slice/crio-49f5145acc314267b8b50e1cb96b1b74ff1de97a13f7c5e4b609e994fbab4f20 WatchSource:0}: Error finding container 49f5145acc314267b8b50e1cb96b1b74ff1de97a13f7c5e4b609e994fbab4f20: Status 404 returned error can't find the container with id 49f5145acc314267b8b50e1cb96b1b74ff1de97a13f7c5e4b609e994fbab4f20 Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.579990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerStarted","Data":"49f5145acc314267b8b50e1cb96b1b74ff1de97a13f7c5e4b609e994fbab4f20"} Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.581296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"f4f8bb3e-7a43-4a4d-8012-143658c681fc","Type":"ContainerStarted","Data":"4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec"} Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.581323 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f4f8bb3e-7a43-4a4d-8012-143658c681fc","Type":"ContainerStarted","Data":"d9fb53a4ac35892c34734ee91b218f6b12a7828255b417f284badbbfa5f0e628"} Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.582324 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:35 crc kubenswrapper[4921]: I0318 12:33:35.603949 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.6039304530000003 podStartE2EDuration="2.603930453s" podCreationTimestamp="2026-03-18 12:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:35.598516789 +0000 UTC m=+1435.148437438" watchObservedRunningTime="2026-03-18 12:33:35.603930453 +0000 UTC m=+1435.153851092" Mar 18 12:33:36 crc kubenswrapper[4921]: I0318 12:33:36.594566 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerStarted","Data":"ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b"} Mar 18 12:33:37 crc kubenswrapper[4921]: I0318 12:33:37.607032 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerStarted","Data":"47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8"} Mar 18 12:33:38 crc kubenswrapper[4921]: I0318 12:33:38.621230 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerStarted","Data":"b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a"} Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.160858 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.641692 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jn2s5"] Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.643079 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.645823 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.645909 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.661623 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jn2s5"] Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.821144 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-config-data\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.821743 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-scripts\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc 
kubenswrapper[4921]: I0318 12:33:39.821827 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.821917 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fpqd\" (UniqueName: \"kubernetes.io/projected/17db594b-8493-4668-ab55-ed7c9f41db14-kube-api-access-5fpqd\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.861049 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.862205 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.865473 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.878136 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.924697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.924742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.924828 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-config-data\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.924845 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-scripts\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 
12:33:39.924881 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbrp\" (UniqueName: \"kubernetes.io/projected/08118125-56d8-489c-83fb-d54c86aff1d4-kube-api-access-dhbrp\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.924901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.924942 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fpqd\" (UniqueName: \"kubernetes.io/projected/17db594b-8493-4668-ab55-ed7c9f41db14-kube-api-access-5fpqd\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.929648 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-config-data\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.930939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-scripts\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.934523 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.965821 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fpqd\" (UniqueName: \"kubernetes.io/projected/17db594b-8493-4668-ab55-ed7c9f41db14-kube-api-access-5fpqd\") pod \"nova-cell0-cell-mapping-jn2s5\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.976639 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.978618 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.979597 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.980487 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:33:39 crc kubenswrapper[4921]: I0318 12:33:39.981033 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026669 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbrp\" (UniqueName: \"kubernetes.io/projected/08118125-56d8-489c-83fb-d54c86aff1d4-kube-api-access-dhbrp\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026730 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-config-data\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026783 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026799 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026847 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-logs\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.026908 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq4g\" (UniqueName: \"kubernetes.io/projected/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-kube-api-access-tjq4g\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.069241 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.069537 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.072174 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:40 crc kubenswrapper[4921]: 
I0318 12:33:40.074035 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.083074 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.119684 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbrp\" (UniqueName: \"kubernetes.io/projected/08118125-56d8-489c-83fb-d54c86aff1d4-kube-api-access-dhbrp\") pod \"nova-cell1-novncproxy-0\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.128161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-config-data\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.128278 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.128315 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-logs\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.128365 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq4g\" (UniqueName: 
\"kubernetes.io/projected/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-kube-api-access-tjq4g\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.132627 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-logs\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.137794 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.142998 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.154222 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-config-data\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.163385 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjq4g\" (UniqueName: \"kubernetes.io/projected/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-kube-api-access-tjq4g\") pod \"nova-metadata-0\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.201476 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.203556 4921 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.214323 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.223331 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.231986 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-config-data\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.232043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.232132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56dcf122-202d-4ead-9bf0-9dd079a704d0-logs\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.232195 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68rfs\" (UniqueName: \"kubernetes.io/projected/56dcf122-202d-4ead-9bf0-9dd079a704d0-kube-api-access-68rfs\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.246397 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-757b4f8459-hfntw"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.248504 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.277400 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hfntw"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.299524 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338295 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-config-data\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338340 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68rfs\" (UniqueName: \"kubernetes.io/projected/56dcf122-202d-4ead-9bf0-9dd079a704d0-kube-api-access-68rfs\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338437 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-config-data\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338496 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " 
pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338521 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338559 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-svc\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338594 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vglnt\" (UniqueName: \"kubernetes.io/projected/71615a6c-de34-4a90-a680-f916d9813518-kube-api-access-vglnt\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338615 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvz7\" (UniqueName: \"kubernetes.io/projected/17799c79-7099-46dd-baf5-444063789a7d-kube-api-access-9mvz7\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338639 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-config\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " 
pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338665 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338681 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338700 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56dcf122-202d-4ead-9bf0-9dd079a704d0-logs\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.338720 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.355146 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56dcf122-202d-4ead-9bf0-9dd079a704d0-logs\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.355842 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-config-data\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.359758 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.372529 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68rfs\" (UniqueName: \"kubernetes.io/projected/56dcf122-202d-4ead-9bf0-9dd079a704d0-kube-api-access-68rfs\") pod \"nova-api-0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.403459 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442324 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-svc\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442447 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vglnt\" (UniqueName: \"kubernetes.io/projected/71615a6c-de34-4a90-a680-f916d9813518-kube-api-access-vglnt\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442475 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvz7\" (UniqueName: \"kubernetes.io/projected/17799c79-7099-46dd-baf5-444063789a7d-kube-api-access-9mvz7\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442511 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-config\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 
crc kubenswrapper[4921]: I0318 12:33:40.442558 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442579 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442616 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.442730 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-config-data\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.443267 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-svc\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.443685 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.444263 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.444658 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.444955 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-config\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.448042 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.453715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-config-data\") pod 
\"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.469673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvz7\" (UniqueName: \"kubernetes.io/projected/17799c79-7099-46dd-baf5-444063789a7d-kube-api-access-9mvz7\") pod \"nova-scheduler-0\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.470970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vglnt\" (UniqueName: \"kubernetes.io/projected/71615a6c-de34-4a90-a680-f916d9813518-kube-api-access-vglnt\") pod \"dnsmasq-dns-757b4f8459-hfntw\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") " pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.599585 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.631191 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.658468 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.669277 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerStarted","Data":"4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7"} Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.669524 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.757545 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.434492049 podStartE2EDuration="6.757525126s" podCreationTimestamp="2026-03-18 12:33:34 +0000 UTC" firstStartedPulling="2026-03-18 12:33:35.459614257 +0000 UTC m=+1435.009534896" lastFinishedPulling="2026-03-18 12:33:39.782647334 +0000 UTC m=+1439.332567973" observedRunningTime="2026-03-18 12:33:40.697416129 +0000 UTC m=+1440.247336768" watchObservedRunningTime="2026-03-18 12:33:40.757525126 +0000 UTC m=+1440.307445765" Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.766691 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jn2s5"] Mar 18 12:33:40 crc kubenswrapper[4921]: I0318 12:33:40.912556 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.035150 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68h69"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.037782 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.042517 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.042704 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.048157 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68h69"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.106801 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.158712 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-config-data\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.158858 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.158893 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk58j\" (UniqueName: \"kubernetes.io/projected/97f16f79-7171-482d-b31a-0a204980fbf6-kube-api-access-nk58j\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " 
pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.158915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-scripts\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.263517 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.264096 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk58j\" (UniqueName: \"kubernetes.io/projected/97f16f79-7171-482d-b31a-0a204980fbf6-kube-api-access-nk58j\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.264157 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-scripts\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.267047 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-config-data\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " 
pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.268497 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.269872 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.286834 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-scripts\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.294994 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk58j\" (UniqueName: \"kubernetes.io/projected/97f16f79-7171-482d-b31a-0a204980fbf6-kube-api-access-nk58j\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.303944 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.316268 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.336801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-config-data\") pod 
\"nova-cell1-conductor-db-sync-68h69\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.371560 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.450019 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.594419 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hfntw"] Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.707262 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d5ab6ab-d300-4365-86c6-e1a782e6ee94","Type":"ContainerStarted","Data":"fe54dae3335eba78186320f8806c38f05430c01c9efe206486e163149459ce38"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.717836 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17799c79-7099-46dd-baf5-444063789a7d","Type":"ContainerStarted","Data":"5e123e52b3c5393a9ccb048b038ab48d1726d02f536f6dcf5fc16ffe8c876454"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.719788 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" event={"ID":"71615a6c-de34-4a90-a680-f916d9813518","Type":"ContainerStarted","Data":"560c54b560ee8c27332e6c76d407af2af64008fe9289d66db6e7d03618753249"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.727770 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jn2s5" event={"ID":"17db594b-8493-4668-ab55-ed7c9f41db14","Type":"ContainerStarted","Data":"9632959f08bc3cd646cbe6c91fa73851a145e8b01415f795c58c8ee45b1d10bb"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.727832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-jn2s5" event={"ID":"17db594b-8493-4668-ab55-ed7c9f41db14","Type":"ContainerStarted","Data":"6e02c617c9b023cbd2d1119d4c915dfe2ca82bbf6f358687f4527960fa730f0d"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.732786 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08118125-56d8-489c-83fb-d54c86aff1d4","Type":"ContainerStarted","Data":"d07ff2b8cd067f8f06830c2babbe72745c24d0288f80ed4d73d3db1014a28d5d"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.748031 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56dcf122-202d-4ead-9bf0-9dd079a704d0","Type":"ContainerStarted","Data":"f75a3d242d0efd233377ee105c20e456017cd611da35b7152981cb37077b9e7b"} Mar 18 12:33:41 crc kubenswrapper[4921]: I0318 12:33:41.758341 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jn2s5" podStartSLOduration=2.758318733 podStartE2EDuration="2.758318733s" podCreationTimestamp="2026-03-18 12:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:41.753692522 +0000 UTC m=+1441.303613171" watchObservedRunningTime="2026-03-18 12:33:41.758318733 +0000 UTC m=+1441.308239372" Mar 18 12:33:42 crc kubenswrapper[4921]: I0318 12:33:42.095571 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68h69"] Mar 18 12:33:42 crc kubenswrapper[4921]: E0318 12:33:42.249867 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71615a6c_de34_4a90_a680_f916d9813518.slice/crio-conmon-1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71615a6c_de34_4a90_a680_f916d9813518.slice/crio-1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:33:42 crc kubenswrapper[4921]: I0318 12:33:42.778563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68h69" event={"ID":"97f16f79-7171-482d-b31a-0a204980fbf6","Type":"ContainerStarted","Data":"e548a8515ac525a12f304eb89af42a6fa6a6e913f8703bde588e092196163ce1"} Mar 18 12:33:42 crc kubenswrapper[4921]: I0318 12:33:42.779081 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68h69" event={"ID":"97f16f79-7171-482d-b31a-0a204980fbf6","Type":"ContainerStarted","Data":"3a99270600fec456b747cb324d0dd66f4d478b857e9cd5351c83d990f33f092d"} Mar 18 12:33:42 crc kubenswrapper[4921]: I0318 12:33:42.786845 4921 generic.go:334] "Generic (PLEG): container finished" podID="71615a6c-de34-4a90-a680-f916d9813518" containerID="1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d" exitCode=0 Mar 18 12:33:42 crc kubenswrapper[4921]: I0318 12:33:42.787336 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" event={"ID":"71615a6c-de34-4a90-a680-f916d9813518","Type":"ContainerDied","Data":"1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d"} Mar 18 12:33:42 crc kubenswrapper[4921]: I0318 12:33:42.841774 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-68h69" podStartSLOduration=2.841750395 podStartE2EDuration="2.841750395s" podCreationTimestamp="2026-03-18 12:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:42.801331498 +0000 UTC m=+1442.351252157" watchObservedRunningTime="2026-03-18 12:33:42.841750395 
+0000 UTC m=+1442.391671034" Mar 18 12:33:43 crc kubenswrapper[4921]: I0318 12:33:43.806765 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:33:43 crc kubenswrapper[4921]: I0318 12:33:43.816865 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.815524 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" event={"ID":"71615a6c-de34-4a90-a680-f916d9813518","Type":"ContainerStarted","Data":"b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.815930 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.818341 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08118125-56d8-489c-83fb-d54c86aff1d4","Type":"ContainerStarted","Data":"4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.818873 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="08118125-56d8-489c-83fb-d54c86aff1d4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520" gracePeriod=30 Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.820805 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56dcf122-202d-4ead-9bf0-9dd079a704d0","Type":"ContainerStarted","Data":"6478000bb7c1e5c177c7ea148a7f8e31375e79721e0974a5882d10b07ae4499f"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.820862 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"56dcf122-202d-4ead-9bf0-9dd079a704d0","Type":"ContainerStarted","Data":"32d9483b1ef29e071dfcef7660f3668f6a7910000f4ea266f4f0821240c9dc76"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.822930 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d5ab6ab-d300-4365-86c6-e1a782e6ee94","Type":"ContainerStarted","Data":"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.822963 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d5ab6ab-d300-4365-86c6-e1a782e6ee94","Type":"ContainerStarted","Data":"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.823102 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-log" containerID="cri-o://b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8" gracePeriod=30 Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.823173 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-metadata" containerID="cri-o://e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a" gracePeriod=30 Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.826017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17799c79-7099-46dd-baf5-444063789a7d","Type":"ContainerStarted","Data":"bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944"} Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.842874 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" podStartSLOduration=5.842856889 
podStartE2EDuration="5.842856889s" podCreationTimestamp="2026-03-18 12:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:45.838002952 +0000 UTC m=+1445.387923591" watchObservedRunningTime="2026-03-18 12:33:45.842856889 +0000 UTC m=+1445.392777528" Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.896229 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.325123861 podStartE2EDuration="5.896207534s" podCreationTimestamp="2026-03-18 12:33:40 +0000 UTC" firstStartedPulling="2026-03-18 12:33:41.345633009 +0000 UTC m=+1440.895553638" lastFinishedPulling="2026-03-18 12:33:44.916716662 +0000 UTC m=+1444.466637311" observedRunningTime="2026-03-18 12:33:45.860711276 +0000 UTC m=+1445.410631915" watchObservedRunningTime="2026-03-18 12:33:45.896207534 +0000 UTC m=+1445.446128183" Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.903419 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.552412492 podStartE2EDuration="5.903400238s" podCreationTimestamp="2026-03-18 12:33:40 +0000 UTC" firstStartedPulling="2026-03-18 12:33:41.56549522 +0000 UTC m=+1441.115415869" lastFinishedPulling="2026-03-18 12:33:44.916482976 +0000 UTC m=+1444.466403615" observedRunningTime="2026-03-18 12:33:45.901453483 +0000 UTC m=+1445.451374122" watchObservedRunningTime="2026-03-18 12:33:45.903400238 +0000 UTC m=+1445.453320867" Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.904363 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.929938773 podStartE2EDuration="6.904355465s" podCreationTimestamp="2026-03-18 12:33:39 +0000 UTC" firstStartedPulling="2026-03-18 12:33:40.948778004 +0000 UTC m=+1440.498698643" lastFinishedPulling="2026-03-18 12:33:44.923194696 
+0000 UTC m=+1444.473115335" observedRunningTime="2026-03-18 12:33:45.887229699 +0000 UTC m=+1445.437150348" watchObservedRunningTime="2026-03-18 12:33:45.904355465 +0000 UTC m=+1445.454276104" Mar 18 12:33:45 crc kubenswrapper[4921]: I0318 12:33:45.924967 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.117482307 podStartE2EDuration="6.92494363s" podCreationTimestamp="2026-03-18 12:33:39 +0000 UTC" firstStartedPulling="2026-03-18 12:33:41.109032933 +0000 UTC m=+1440.658953572" lastFinishedPulling="2026-03-18 12:33:44.916494266 +0000 UTC m=+1444.466414895" observedRunningTime="2026-03-18 12:33:45.915236174 +0000 UTC m=+1445.465156813" watchObservedRunningTime="2026-03-18 12:33:45.92494363 +0000 UTC m=+1445.474864269" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.427917 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.495488 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-logs\") pod \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.495676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-config-data\") pod \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.495726 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjq4g\" (UniqueName: \"kubernetes.io/projected/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-kube-api-access-tjq4g\") pod \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\" (UID: 
\"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.495886 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-combined-ca-bundle\") pod \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\" (UID: \"7d5ab6ab-d300-4365-86c6-e1a782e6ee94\") " Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.496626 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-logs" (OuterVolumeSpecName: "logs") pod "7d5ab6ab-d300-4365-86c6-e1a782e6ee94" (UID: "7d5ab6ab-d300-4365-86c6-e1a782e6ee94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.497133 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.508432 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-kube-api-access-tjq4g" (OuterVolumeSpecName: "kube-api-access-tjq4g") pod "7d5ab6ab-d300-4365-86c6-e1a782e6ee94" (UID: "7d5ab6ab-d300-4365-86c6-e1a782e6ee94"). InnerVolumeSpecName "kube-api-access-tjq4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.535221 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d5ab6ab-d300-4365-86c6-e1a782e6ee94" (UID: "7d5ab6ab-d300-4365-86c6-e1a782e6ee94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.559373 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-config-data" (OuterVolumeSpecName: "config-data") pod "7d5ab6ab-d300-4365-86c6-e1a782e6ee94" (UID: "7d5ab6ab-d300-4365-86c6-e1a782e6ee94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.599185 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjq4g\" (UniqueName: \"kubernetes.io/projected/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-kube-api-access-tjq4g\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.599232 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.599249 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5ab6ab-d300-4365-86c6-e1a782e6ee94-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.844602 4921 generic.go:334] "Generic (PLEG): container finished" podID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerID="e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a" exitCode=0 Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.845521 4921 generic.go:334] "Generic (PLEG): container finished" podID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerID="b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8" exitCode=143 Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.844708 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7d5ab6ab-d300-4365-86c6-e1a782e6ee94","Type":"ContainerDied","Data":"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a"} Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.845711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d5ab6ab-d300-4365-86c6-e1a782e6ee94","Type":"ContainerDied","Data":"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8"} Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.845734 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d5ab6ab-d300-4365-86c6-e1a782e6ee94","Type":"ContainerDied","Data":"fe54dae3335eba78186320f8806c38f05430c01c9efe206486e163149459ce38"} Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.845761 4921 scope.go:117] "RemoveContainer" containerID="e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.844697 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.886974 4921 scope.go:117] "RemoveContainer" containerID="b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.908814 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.930327 4921 scope.go:117] "RemoveContainer" containerID="e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.930357 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:46 crc kubenswrapper[4921]: E0318 12:33:46.938729 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a\": container with ID starting with e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a not found: ID does not exist" containerID="e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.938780 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a"} err="failed to get container status \"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a\": rpc error: code = NotFound desc = could not find container \"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a\": container with ID starting with e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a not found: ID does not exist" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.938813 4921 scope.go:117] "RemoveContainer" containerID="b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8" Mar 18 12:33:46 crc kubenswrapper[4921]: 
E0318 12:33:46.939121 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8\": container with ID starting with b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8 not found: ID does not exist" containerID="b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.939151 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8"} err="failed to get container status \"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8\": rpc error: code = NotFound desc = could not find container \"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8\": container with ID starting with b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8 not found: ID does not exist" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.939169 4921 scope.go:117] "RemoveContainer" containerID="e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.943236 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a"} err="failed to get container status \"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a\": rpc error: code = NotFound desc = could not find container \"e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a\": container with ID starting with e8437ba6db25ab9b7c02f8fde9e2f93902b5b97894830ba72e3c412d06d3c38a not found: ID does not exist" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.943298 4921 scope.go:117] "RemoveContainer" containerID="b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8" Mar 18 12:33:46 crc 
kubenswrapper[4921]: I0318 12:33:46.943702 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8"} err="failed to get container status \"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8\": rpc error: code = NotFound desc = could not find container \"b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8\": container with ID starting with b6e435e8c6ab04c4cd3874309a2e6e0b4c52c7793eb1234b05e47d02ec332af8 not found: ID does not exist" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.953056 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:46 crc kubenswrapper[4921]: E0318 12:33:46.953795 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-metadata" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.953826 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-metadata" Mar 18 12:33:46 crc kubenswrapper[4921]: E0318 12:33:46.953893 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-log" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.953910 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-log" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.954254 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-log" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.954289 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" containerName="nova-metadata-metadata" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 
12:33:46.956194 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.959323 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.959562 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:33:46 crc kubenswrapper[4921]: I0318 12:33:46.969072 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.131979 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.132136 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-config-data\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.132567 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.132837 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ffb32e44-5e60-4419-9600-741cb7d890b8-logs\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.133011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgs8r\" (UniqueName: \"kubernetes.io/projected/ffb32e44-5e60-4419-9600-741cb7d890b8-kube-api-access-fgs8r\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.219641 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5ab6ab-d300-4365-86c6-e1a782e6ee94" path="/var/lib/kubelet/pods/7d5ab6ab-d300-4365-86c6-e1a782e6ee94/volumes" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.235754 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.235824 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffb32e44-5e60-4419-9600-741cb7d890b8-logs\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.235877 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgs8r\" (UniqueName: \"kubernetes.io/projected/ffb32e44-5e60-4419-9600-741cb7d890b8-kube-api-access-fgs8r\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.235928 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.235966 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-config-data\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.236958 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffb32e44-5e60-4419-9600-741cb7d890b8-logs\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.247551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.248540 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.251603 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-config-data\") pod \"nova-metadata-0\" (UID: 
\"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.270934 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgs8r\" (UniqueName: \"kubernetes.io/projected/ffb32e44-5e60-4419-9600-741cb7d890b8-kube-api-access-fgs8r\") pod \"nova-metadata-0\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.348353 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.829965 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:47 crc kubenswrapper[4921]: I0318 12:33:47.861150 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ffb32e44-5e60-4419-9600-741cb7d890b8","Type":"ContainerStarted","Data":"c952bf44ecc1bcb805341bb3873c74acb99534ed30e17021ab95a85c1a46e46d"} Mar 18 12:33:48 crc kubenswrapper[4921]: I0318 12:33:48.873698 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ffb32e44-5e60-4419-9600-741cb7d890b8","Type":"ContainerStarted","Data":"d44f0a3e816d5fea199466afc82b9a04044f84a74a08b3e06277af26ceb763cb"} Mar 18 12:33:48 crc kubenswrapper[4921]: I0318 12:33:48.874094 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ffb32e44-5e60-4419-9600-741cb7d890b8","Type":"ContainerStarted","Data":"d187b68bdd2dce5b73abcfa093454de2e9d28298a66163df261947fdbc5c56e5"} Mar 18 12:33:48 crc kubenswrapper[4921]: I0318 12:33:48.900960 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.900938932 podStartE2EDuration="2.900938932s" podCreationTimestamp="2026-03-18 12:33:46 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:48.895418396 +0000 UTC m=+1448.445339045" watchObservedRunningTime="2026-03-18 12:33:48.900938932 +0000 UTC m=+1448.450859591" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.083939 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvj5"] Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.086367 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.099458 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvj5"] Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.174344 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-utilities\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.174504 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-catalog-content\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.174617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vz6\" (UniqueName: \"kubernetes.io/projected/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-kube-api-access-p5vz6\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " 
pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.275871 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vz6\" (UniqueName: \"kubernetes.io/projected/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-kube-api-access-p5vz6\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.276660 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-utilities\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.278678 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-catalog-content\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.277232 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-utilities\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.279292 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-catalog-content\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" 
Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.310084 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vz6\" (UniqueName: \"kubernetes.io/projected/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-kube-api-access-p5vz6\") pod \"redhat-marketplace-4pvj5\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.428822 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:49 crc kubenswrapper[4921]: W0318 12:33:49.892577 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10885b1e_9d73_4fcd_b0da_5e73707e9c6b.slice/crio-c22956439338bfbb1f2a15db32aa0b56a330e7de695fd40b835cb1acc0bead11 WatchSource:0}: Error finding container c22956439338bfbb1f2a15db32aa0b56a330e7de695fd40b835cb1acc0bead11: Status 404 returned error can't find the container with id c22956439338bfbb1f2a15db32aa0b56a330e7de695fd40b835cb1acc0bead11 Mar 18 12:33:49 crc kubenswrapper[4921]: I0318 12:33:49.901971 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvj5"] Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.405028 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.601148 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.601213 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.632157 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:33:50 crc 
kubenswrapper[4921]: I0318 12:33:50.632490 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.661534 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.662730 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.756847 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-phf9s"] Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.757267 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" containerName="dnsmasq-dns" containerID="cri-o://c07da51ea5d42b20509a8bfcec21fdd4d030ff5c491ccc9255b87fb76712f053" gracePeriod=10 Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.894400 4921 generic.go:334] "Generic (PLEG): container finished" podID="97f16f79-7171-482d-b31a-0a204980fbf6" containerID="e548a8515ac525a12f304eb89af42a6fa6a6e913f8703bde588e092196163ce1" exitCode=0 Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.894500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68h69" event={"ID":"97f16f79-7171-482d-b31a-0a204980fbf6","Type":"ContainerDied","Data":"e548a8515ac525a12f304eb89af42a6fa6a6e913f8703bde588e092196163ce1"} Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.901174 4921 generic.go:334] "Generic (PLEG): container finished" podID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerID="73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2" exitCode=0 Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.901312 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4pvj5" event={"ID":"10885b1e-9d73-4fcd-b0da-5e73707e9c6b","Type":"ContainerDied","Data":"73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2"} Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.901386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvj5" event={"ID":"10885b1e-9d73-4fcd-b0da-5e73707e9c6b","Type":"ContainerStarted","Data":"c22956439338bfbb1f2a15db32aa0b56a330e7de695fd40b835cb1acc0bead11"} Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.933639 4921 generic.go:334] "Generic (PLEG): container finished" podID="b6ba9a5c-c719-470a-b047-437082f292d6" containerID="c07da51ea5d42b20509a8bfcec21fdd4d030ff5c491ccc9255b87fb76712f053" exitCode=0 Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.933958 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" event={"ID":"b6ba9a5c-c719-470a-b047-437082f292d6","Type":"ContainerDied","Data":"c07da51ea5d42b20509a8bfcec21fdd4d030ff5c491ccc9255b87fb76712f053"} Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.950072 4921 generic.go:334] "Generic (PLEG): container finished" podID="17db594b-8493-4668-ab55-ed7c9f41db14" containerID="9632959f08bc3cd646cbe6c91fa73851a145e8b01415f795c58c8ee45b1d10bb" exitCode=0 Mar 18 12:33:50 crc kubenswrapper[4921]: I0318 12:33:50.950768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jn2s5" event={"ID":"17db594b-8493-4668-ab55-ed7c9f41db14","Type":"ContainerDied","Data":"9632959f08bc3cd646cbe6c91fa73851a145e8b01415f795c58c8ee45b1d10bb"} Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.022921 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.353346 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.448523 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-svc\") pod \"b6ba9a5c-c719-470a-b047-437082f292d6\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.448717 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-swift-storage-0\") pod \"b6ba9a5c-c719-470a-b047-437082f292d6\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.448826 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-sb\") pod \"b6ba9a5c-c719-470a-b047-437082f292d6\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.448854 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-config\") pod \"b6ba9a5c-c719-470a-b047-437082f292d6\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.448944 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-nb\") pod \"b6ba9a5c-c719-470a-b047-437082f292d6\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.449021 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gngpq\" 
(UniqueName: \"kubernetes.io/projected/b6ba9a5c-c719-470a-b047-437082f292d6-kube-api-access-gngpq\") pod \"b6ba9a5c-c719-470a-b047-437082f292d6\" (UID: \"b6ba9a5c-c719-470a-b047-437082f292d6\") " Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.457409 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ba9a5c-c719-470a-b047-437082f292d6-kube-api-access-gngpq" (OuterVolumeSpecName: "kube-api-access-gngpq") pod "b6ba9a5c-c719-470a-b047-437082f292d6" (UID: "b6ba9a5c-c719-470a-b047-437082f292d6"). InnerVolumeSpecName "kube-api-access-gngpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.526704 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6ba9a5c-c719-470a-b047-437082f292d6" (UID: "b6ba9a5c-c719-470a-b047-437082f292d6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.535421 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6ba9a5c-c719-470a-b047-437082f292d6" (UID: "b6ba9a5c-c719-470a-b047-437082f292d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.535784 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6ba9a5c-c719-470a-b047-437082f292d6" (UID: "b6ba9a5c-c719-470a-b047-437082f292d6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.550220 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-config" (OuterVolumeSpecName: "config") pod "b6ba9a5c-c719-470a-b047-437082f292d6" (UID: "b6ba9a5c-c719-470a-b047-437082f292d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.551404 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.551426 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gngpq\" (UniqueName: \"kubernetes.io/projected/b6ba9a5c-c719-470a-b047-437082f292d6-kube-api-access-gngpq\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.551435 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.551445 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.551452 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.580721 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6ba9a5c-c719-470a-b047-437082f292d6" (UID: "b6ba9a5c-c719-470a-b047-437082f292d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.652776 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6ba9a5c-c719-470a-b047-437082f292d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.683329 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.683372 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.959930 4921 generic.go:334] "Generic (PLEG): container finished" podID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerID="bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a" exitCode=0 Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.960029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvj5" event={"ID":"10885b1e-9d73-4fcd-b0da-5e73707e9c6b","Type":"ContainerDied","Data":"bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a"} Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.962662 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.964275 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-phf9s" event={"ID":"b6ba9a5c-c719-470a-b047-437082f292d6","Type":"ContainerDied","Data":"9954f490475fc861bd4f6ef8c65791bb210ef59a5589c5fc93f477b64832ea51"} Mar 18 12:33:51 crc kubenswrapper[4921]: I0318 12:33:51.964323 4921 scope.go:117] "RemoveContainer" containerID="c07da51ea5d42b20509a8bfcec21fdd4d030ff5c491ccc9255b87fb76712f053" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.039516 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-phf9s"] Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.039821 4921 scope.go:117] "RemoveContainer" containerID="56cfbcdbf457274044cc596b4a976b5f0345edae0248288903ae6d4e5cdb0409" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.056240 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-phf9s"] Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.476706 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.478644 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk58j\" (UniqueName: \"kubernetes.io/projected/97f16f79-7171-482d-b31a-0a204980fbf6-kube-api-access-nk58j\") pod \"97f16f79-7171-482d-b31a-0a204980fbf6\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.478751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-config-data\") pod \"97f16f79-7171-482d-b31a-0a204980fbf6\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.478802 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-combined-ca-bundle\") pod \"97f16f79-7171-482d-b31a-0a204980fbf6\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.478859 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-scripts\") pod \"97f16f79-7171-482d-b31a-0a204980fbf6\" (UID: \"97f16f79-7171-482d-b31a-0a204980fbf6\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.485325 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f16f79-7171-482d-b31a-0a204980fbf6-kube-api-access-nk58j" (OuterVolumeSpecName: "kube-api-access-nk58j") pod "97f16f79-7171-482d-b31a-0a204980fbf6" (UID: "97f16f79-7171-482d-b31a-0a204980fbf6"). InnerVolumeSpecName "kube-api-access-nk58j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.485616 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-scripts" (OuterVolumeSpecName: "scripts") pod "97f16f79-7171-482d-b31a-0a204980fbf6" (UID: "97f16f79-7171-482d-b31a-0a204980fbf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.504733 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.529316 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-config-data" (OuterVolumeSpecName: "config-data") pod "97f16f79-7171-482d-b31a-0a204980fbf6" (UID: "97f16f79-7171-482d-b31a-0a204980fbf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.566595 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97f16f79-7171-482d-b31a-0a204980fbf6" (UID: "97f16f79-7171-482d-b31a-0a204980fbf6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.580388 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.580426 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk58j\" (UniqueName: \"kubernetes.io/projected/97f16f79-7171-482d-b31a-0a204980fbf6-kube-api-access-nk58j\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.580438 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.580449 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97f16f79-7171-482d-b31a-0a204980fbf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.681940 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-config-data\") pod \"17db594b-8493-4668-ab55-ed7c9f41db14\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.682031 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-combined-ca-bundle\") pod \"17db594b-8493-4668-ab55-ed7c9f41db14\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.682187 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fpqd\" 
(UniqueName: \"kubernetes.io/projected/17db594b-8493-4668-ab55-ed7c9f41db14-kube-api-access-5fpqd\") pod \"17db594b-8493-4668-ab55-ed7c9f41db14\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.682295 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-scripts\") pod \"17db594b-8493-4668-ab55-ed7c9f41db14\" (UID: \"17db594b-8493-4668-ab55-ed7c9f41db14\") " Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.686518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17db594b-8493-4668-ab55-ed7c9f41db14-kube-api-access-5fpqd" (OuterVolumeSpecName: "kube-api-access-5fpqd") pod "17db594b-8493-4668-ab55-ed7c9f41db14" (UID: "17db594b-8493-4668-ab55-ed7c9f41db14"). InnerVolumeSpecName "kube-api-access-5fpqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.687992 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-scripts" (OuterVolumeSpecName: "scripts") pod "17db594b-8493-4668-ab55-ed7c9f41db14" (UID: "17db594b-8493-4668-ab55-ed7c9f41db14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.711018 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17db594b-8493-4668-ab55-ed7c9f41db14" (UID: "17db594b-8493-4668-ab55-ed7c9f41db14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.713408 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-config-data" (OuterVolumeSpecName: "config-data") pod "17db594b-8493-4668-ab55-ed7c9f41db14" (UID: "17db594b-8493-4668-ab55-ed7c9f41db14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.784795 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fpqd\" (UniqueName: \"kubernetes.io/projected/17db594b-8493-4668-ab55-ed7c9f41db14-kube-api-access-5fpqd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.784830 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.784839 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:52 crc kubenswrapper[4921]: I0318 12:33:52.784847 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17db594b-8493-4668-ab55-ed7c9f41db14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:52.999839 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvj5" event={"ID":"10885b1e-9d73-4fcd-b0da-5e73707e9c6b","Type":"ContainerStarted","Data":"1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578"} Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.006062 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-jn2s5" event={"ID":"17db594b-8493-4668-ab55-ed7c9f41db14","Type":"ContainerDied","Data":"6e02c617c9b023cbd2d1119d4c915dfe2ca82bbf6f358687f4527960fa730f0d"} Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.006101 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e02c617c9b023cbd2d1119d4c915dfe2ca82bbf6f358687f4527960fa730f0d" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.006178 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jn2s5" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.015872 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-68h69" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.015938 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-68h69" event={"ID":"97f16f79-7171-482d-b31a-0a204980fbf6","Type":"ContainerDied","Data":"3a99270600fec456b747cb324d0dd66f4d478b857e9cd5351c83d990f33f092d"} Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.015962 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a99270600fec456b747cb324d0dd66f4d478b857e9cd5351c83d990f33f092d" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.051280 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:33:53 crc kubenswrapper[4921]: E0318 12:33:53.051964 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f16f79-7171-482d-b31a-0a204980fbf6" containerName="nova-cell1-conductor-db-sync" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.051980 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f16f79-7171-482d-b31a-0a204980fbf6" containerName="nova-cell1-conductor-db-sync" Mar 18 12:33:53 crc kubenswrapper[4921]: E0318 12:33:53.052032 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" containerName="init" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.052063 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" containerName="init" Mar 18 12:33:53 crc kubenswrapper[4921]: E0318 12:33:53.052084 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17db594b-8493-4668-ab55-ed7c9f41db14" containerName="nova-manage" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.052092 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="17db594b-8493-4668-ab55-ed7c9f41db14" containerName="nova-manage" Mar 18 12:33:53 crc kubenswrapper[4921]: E0318 12:33:53.052146 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" containerName="dnsmasq-dns" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.052153 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" containerName="dnsmasq-dns" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.052436 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f16f79-7171-482d-b31a-0a204980fbf6" containerName="nova-cell1-conductor-db-sync" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.052456 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="17db594b-8493-4668-ab55-ed7c9f41db14" containerName="nova-manage" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.052479 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" containerName="dnsmasq-dns" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.053472 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.057342 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.061185 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4pvj5" podStartSLOduration=2.577727 podStartE2EDuration="4.061163567s" podCreationTimestamp="2026-03-18 12:33:49 +0000 UTC" firstStartedPulling="2026-03-18 12:33:50.908405003 +0000 UTC m=+1450.458325642" lastFinishedPulling="2026-03-18 12:33:52.39184157 +0000 UTC m=+1451.941762209" observedRunningTime="2026-03-18 12:33:53.02178731 +0000 UTC m=+1452.571707949" watchObservedRunningTime="2026-03-18 12:33:53.061163567 +0000 UTC m=+1452.611084206" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.085013 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.188553 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.188826 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-log" containerID="cri-o://32d9483b1ef29e071dfcef7660f3668f6a7910000f4ea266f4f0821240c9dc76" gracePeriod=30 Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.188985 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-api" containerID="cri-o://6478000bb7c1e5c177c7ea148a7f8e31375e79721e0974a5882d10b07ae4499f" gracePeriod=30 Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.194788 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.194859 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgvnj\" (UniqueName: \"kubernetes.io/projected/1a183a61-e314-4bd0-b332-3d216d70c6c2-kube-api-access-zgvnj\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.195052 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.206475 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.221168 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ba9a5c-c719-470a-b047-437082f292d6" path="/var/lib/kubelet/pods/b6ba9a5c-c719-470a-b047-437082f292d6/volumes" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.222788 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.224365 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-log" containerID="cri-o://d187b68bdd2dce5b73abcfa093454de2e9d28298a66163df261947fdbc5c56e5" 
gracePeriod=30 Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.224547 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-metadata" containerID="cri-o://d44f0a3e816d5fea199466afc82b9a04044f84a74a08b3e06277af26ceb763cb" gracePeriod=30 Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.297188 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.297490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgvnj\" (UniqueName: \"kubernetes.io/projected/1a183a61-e314-4bd0-b332-3d216d70c6c2-kube-api-access-zgvnj\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.297736 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.302830 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.302941 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.322643 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgvnj\" (UniqueName: \"kubernetes.io/projected/1a183a61-e314-4bd0-b332-3d216d70c6c2-kube-api-access-zgvnj\") pod \"nova-cell1-conductor-0\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.370339 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:53 crc kubenswrapper[4921]: I0318 12:33:53.882083 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.026808 4921 generic.go:334] "Generic (PLEG): container finished" podID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerID="d44f0a3e816d5fea199466afc82b9a04044f84a74a08b3e06277af26ceb763cb" exitCode=0 Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.026860 4921 generic.go:334] "Generic (PLEG): container finished" podID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerID="d187b68bdd2dce5b73abcfa093454de2e9d28298a66163df261947fdbc5c56e5" exitCode=143 Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.026890 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ffb32e44-5e60-4419-9600-741cb7d890b8","Type":"ContainerDied","Data":"d44f0a3e816d5fea199466afc82b9a04044f84a74a08b3e06277af26ceb763cb"} Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.026952 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ffb32e44-5e60-4419-9600-741cb7d890b8","Type":"ContainerDied","Data":"d187b68bdd2dce5b73abcfa093454de2e9d28298a66163df261947fdbc5c56e5"} Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.028791 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1a183a61-e314-4bd0-b332-3d216d70c6c2","Type":"ContainerStarted","Data":"2a02e55df0f22299348136d542d6bb374bcee9d78d5782a167168731d958a271"} Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.033098 4921 generic.go:334] "Generic (PLEG): container finished" podID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerID="32d9483b1ef29e071dfcef7660f3668f6a7910000f4ea266f4f0821240c9dc76" exitCode=143 Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.033204 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56dcf122-202d-4ead-9bf0-9dd079a704d0","Type":"ContainerDied","Data":"32d9483b1ef29e071dfcef7660f3668f6a7910000f4ea266f4f0821240c9dc76"} Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.033590 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="17799c79-7099-46dd-baf5-444063789a7d" containerName="nova-scheduler-scheduler" containerID="cri-o://bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" gracePeriod=30 Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.385853 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.520465 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-nova-metadata-tls-certs\") pod \"ffb32e44-5e60-4419-9600-741cb7d890b8\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.520521 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-combined-ca-bundle\") pod \"ffb32e44-5e60-4419-9600-741cb7d890b8\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.520633 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgs8r\" (UniqueName: \"kubernetes.io/projected/ffb32e44-5e60-4419-9600-741cb7d890b8-kube-api-access-fgs8r\") pod \"ffb32e44-5e60-4419-9600-741cb7d890b8\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.520655 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffb32e44-5e60-4419-9600-741cb7d890b8-logs\") pod \"ffb32e44-5e60-4419-9600-741cb7d890b8\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.520768 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-config-data\") pod \"ffb32e44-5e60-4419-9600-741cb7d890b8\" (UID: \"ffb32e44-5e60-4419-9600-741cb7d890b8\") " Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.523473 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ffb32e44-5e60-4419-9600-741cb7d890b8-logs" (OuterVolumeSpecName: "logs") pod "ffb32e44-5e60-4419-9600-741cb7d890b8" (UID: "ffb32e44-5e60-4419-9600-741cb7d890b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.526054 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb32e44-5e60-4419-9600-741cb7d890b8-kube-api-access-fgs8r" (OuterVolumeSpecName: "kube-api-access-fgs8r") pod "ffb32e44-5e60-4419-9600-741cb7d890b8" (UID: "ffb32e44-5e60-4419-9600-741cb7d890b8"). InnerVolumeSpecName "kube-api-access-fgs8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.550017 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-config-data" (OuterVolumeSpecName: "config-data") pod "ffb32e44-5e60-4419-9600-741cb7d890b8" (UID: "ffb32e44-5e60-4419-9600-741cb7d890b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.551444 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffb32e44-5e60-4419-9600-741cb7d890b8" (UID: "ffb32e44-5e60-4419-9600-741cb7d890b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.581575 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ffb32e44-5e60-4419-9600-741cb7d890b8" (UID: "ffb32e44-5e60-4419-9600-741cb7d890b8"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.623504 4921 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.623541 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.623585 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgs8r\" (UniqueName: \"kubernetes.io/projected/ffb32e44-5e60-4419-9600-741cb7d890b8-kube-api-access-fgs8r\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.623594 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffb32e44-5e60-4419-9600-741cb7d890b8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4921]: I0318 12:33:54.623603 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb32e44-5e60-4419-9600-741cb7d890b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.065780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ffb32e44-5e60-4419-9600-741cb7d890b8","Type":"ContainerDied","Data":"c952bf44ecc1bcb805341bb3873c74acb99534ed30e17021ab95a85c1a46e46d"} Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.066175 4921 scope.go:117] "RemoveContainer" containerID="d44f0a3e816d5fea199466afc82b9a04044f84a74a08b3e06277af26ceb763cb" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.065803 4921 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.068960 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1a183a61-e314-4bd0-b332-3d216d70c6c2","Type":"ContainerStarted","Data":"3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b"} Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.069288 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.095416 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.095394918 podStartE2EDuration="2.095394918s" podCreationTimestamp="2026-03-18 12:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:55.091327413 +0000 UTC m=+1454.641248072" watchObservedRunningTime="2026-03-18 12:33:55.095394918 +0000 UTC m=+1454.645315567" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.117148 4921 scope.go:117] "RemoveContainer" containerID="d187b68bdd2dce5b73abcfa093454de2e9d28298a66163df261947fdbc5c56e5" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.125157 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.140011 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.152346 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:55 crc kubenswrapper[4921]: E0318 12:33:55.152809 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-metadata" Mar 
18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.152835 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-metadata" Mar 18 12:33:55 crc kubenswrapper[4921]: E0318 12:33:55.152869 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-log" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.152879 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-log" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.153092 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-metadata" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.153422 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" containerName="nova-metadata-log" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.154634 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.157048 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.163652 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.198393 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.266763 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb32e44-5e60-4419-9600-741cb7d890b8" path="/var/lib/kubelet/pods/ffb32e44-5e60-4419-9600-741cb7d890b8/volumes" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.352625 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-config-data\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.352687 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107bb63f-8854-49d1-b634-ef35890935c5-logs\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.352717 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5dss\" (UniqueName: \"kubernetes.io/projected/107bb63f-8854-49d1-b634-ef35890935c5-kube-api-access-h5dss\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.352799 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.352877 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.453901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.453969 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-config-data\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.453995 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107bb63f-8854-49d1-b634-ef35890935c5-logs\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.454020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5dss\" (UniqueName: 
\"kubernetes.io/projected/107bb63f-8854-49d1-b634-ef35890935c5-kube-api-access-h5dss\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.454103 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.454802 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107bb63f-8854-49d1-b634-ef35890935c5-logs\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.458184 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.458399 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-config-data\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.467188 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc 
kubenswrapper[4921]: I0318 12:33:55.473649 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5dss\" (UniqueName: \"kubernetes.io/projected/107bb63f-8854-49d1-b634-ef35890935c5-kube-api-access-h5dss\") pod \"nova-metadata-0\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " pod="openstack/nova-metadata-0" Mar 18 12:33:55 crc kubenswrapper[4921]: E0318 12:33:55.633983 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:33:55 crc kubenswrapper[4921]: E0318 12:33:55.635258 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:33:55 crc kubenswrapper[4921]: E0318 12:33:55.637072 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:33:55 crc kubenswrapper[4921]: E0318 12:33:55.637158 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="17799c79-7099-46dd-baf5-444063789a7d" containerName="nova-scheduler-scheduler" Mar 18 12:33:55 crc kubenswrapper[4921]: I0318 12:33:55.773582 4921 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:33:56 crc kubenswrapper[4921]: I0318 12:33:56.215076 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.091071 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"107bb63f-8854-49d1-b634-ef35890935c5","Type":"ContainerStarted","Data":"31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732"} Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.091348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"107bb63f-8854-49d1-b634-ef35890935c5","Type":"ContainerStarted","Data":"560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4"} Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.091360 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"107bb63f-8854-49d1-b634-ef35890935c5","Type":"ContainerStarted","Data":"de9c6e0fdf2d7abccd273af4842f09d6f62f568158ab6ecbdfeb91313ebeeec6"} Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.122447 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.122417984 podStartE2EDuration="2.122417984s" podCreationTimestamp="2026-03-18 12:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:57.107376597 +0000 UTC m=+1456.657297246" watchObservedRunningTime="2026-03-18 12:33:57.122417984 +0000 UTC m=+1456.672338643" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.596253 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.706946 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-config-data\") pod \"17799c79-7099-46dd-baf5-444063789a7d\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.707161 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-combined-ca-bundle\") pod \"17799c79-7099-46dd-baf5-444063789a7d\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.707252 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mvz7\" (UniqueName: \"kubernetes.io/projected/17799c79-7099-46dd-baf5-444063789a7d-kube-api-access-9mvz7\") pod \"17799c79-7099-46dd-baf5-444063789a7d\" (UID: \"17799c79-7099-46dd-baf5-444063789a7d\") " Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.725704 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17799c79-7099-46dd-baf5-444063789a7d-kube-api-access-9mvz7" (OuterVolumeSpecName: "kube-api-access-9mvz7") pod "17799c79-7099-46dd-baf5-444063789a7d" (UID: "17799c79-7099-46dd-baf5-444063789a7d"). InnerVolumeSpecName "kube-api-access-9mvz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.739617 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-config-data" (OuterVolumeSpecName: "config-data") pod "17799c79-7099-46dd-baf5-444063789a7d" (UID: "17799c79-7099-46dd-baf5-444063789a7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.742482 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17799c79-7099-46dd-baf5-444063789a7d" (UID: "17799c79-7099-46dd-baf5-444063789a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.810781 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.811122 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17799c79-7099-46dd-baf5-444063789a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:57 crc kubenswrapper[4921]: I0318 12:33:57.811133 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mvz7\" (UniqueName: \"kubernetes.io/projected/17799c79-7099-46dd-baf5-444063789a7d-kube-api-access-9mvz7\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.100811 4921 generic.go:334] "Generic (PLEG): container finished" podID="17799c79-7099-46dd-baf5-444063789a7d" containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" exitCode=0 Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.100881 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.100902 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17799c79-7099-46dd-baf5-444063789a7d","Type":"ContainerDied","Data":"bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944"} Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.100944 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17799c79-7099-46dd-baf5-444063789a7d","Type":"ContainerDied","Data":"5e123e52b3c5393a9ccb048b038ab48d1726d02f536f6dcf5fc16ffe8c876454"} Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.100966 4921 scope.go:117] "RemoveContainer" containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.104554 4921 generic.go:334] "Generic (PLEG): container finished" podID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerID="6478000bb7c1e5c177c7ea148a7f8e31375e79721e0974a5882d10b07ae4499f" exitCode=0 Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.104617 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56dcf122-202d-4ead-9bf0-9dd079a704d0","Type":"ContainerDied","Data":"6478000bb7c1e5c177c7ea148a7f8e31375e79721e0974a5882d10b07ae4499f"} Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.126256 4921 scope.go:117] "RemoveContainer" containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" Mar 18 12:33:58 crc kubenswrapper[4921]: E0318 12:33:58.130309 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944\": container with ID starting with bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944 not found: ID does not exist" 
containerID="bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.130372 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944"} err="failed to get container status \"bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944\": rpc error: code = NotFound desc = could not find container \"bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944\": container with ID starting with bad90c78f99d56d92c2060a6655dfe59a76e5308af810fe256182e6c696ed944 not found: ID does not exist" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.141487 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.159011 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.170921 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:58 crc kubenswrapper[4921]: E0318 12:33:58.171429 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17799c79-7099-46dd-baf5-444063789a7d" containerName="nova-scheduler-scheduler" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.171454 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="17799c79-7099-46dd-baf5-444063789a7d" containerName="nova-scheduler-scheduler" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.171675 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="17799c79-7099-46dd-baf5-444063789a7d" containerName="nova-scheduler-scheduler" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.172364 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.176102 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.181170 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.221439 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-config-data\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.221480 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glnwk\" (UniqueName: \"kubernetes.io/projected/73d4aa03-5839-43ea-803e-64d12b544e1e-kube-api-access-glnwk\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.221762 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.323567 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.323719 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-config-data\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.323739 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glnwk\" (UniqueName: \"kubernetes.io/projected/73d4aa03-5839-43ea-803e-64d12b544e1e-kube-api-access-glnwk\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.327787 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-config-data\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.327967 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.344834 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glnwk\" (UniqueName: \"kubernetes.io/projected/73d4aa03-5839-43ea-803e-64d12b544e1e-kube-api-access-glnwk\") pod \"nova-scheduler-0\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.429121 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.496440 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.627496 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-config-data\") pod \"56dcf122-202d-4ead-9bf0-9dd079a704d0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.627995 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68rfs\" (UniqueName: \"kubernetes.io/projected/56dcf122-202d-4ead-9bf0-9dd079a704d0-kube-api-access-68rfs\") pod \"56dcf122-202d-4ead-9bf0-9dd079a704d0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.628161 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56dcf122-202d-4ead-9bf0-9dd079a704d0-logs\") pod \"56dcf122-202d-4ead-9bf0-9dd079a704d0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.628196 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-combined-ca-bundle\") pod \"56dcf122-202d-4ead-9bf0-9dd079a704d0\" (UID: \"56dcf122-202d-4ead-9bf0-9dd079a704d0\") " Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.629864 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56dcf122-202d-4ead-9bf0-9dd079a704d0-logs" (OuterVolumeSpecName: "logs") pod "56dcf122-202d-4ead-9bf0-9dd079a704d0" (UID: "56dcf122-202d-4ead-9bf0-9dd079a704d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.634063 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56dcf122-202d-4ead-9bf0-9dd079a704d0-kube-api-access-68rfs" (OuterVolumeSpecName: "kube-api-access-68rfs") pod "56dcf122-202d-4ead-9bf0-9dd079a704d0" (UID: "56dcf122-202d-4ead-9bf0-9dd079a704d0"). InnerVolumeSpecName "kube-api-access-68rfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.656627 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56dcf122-202d-4ead-9bf0-9dd079a704d0" (UID: "56dcf122-202d-4ead-9bf0-9dd079a704d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.658609 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-config-data" (OuterVolumeSpecName: "config-data") pod "56dcf122-202d-4ead-9bf0-9dd079a704d0" (UID: "56dcf122-202d-4ead-9bf0-9dd079a704d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.730479 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.730516 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56dcf122-202d-4ead-9bf0-9dd079a704d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.730526 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68rfs\" (UniqueName: \"kubernetes.io/projected/56dcf122-202d-4ead-9bf0-9dd079a704d0-kube-api-access-68rfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.730538 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56dcf122-202d-4ead-9bf0-9dd079a704d0-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4921]: W0318 12:33:58.982802 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73d4aa03_5839_43ea_803e_64d12b544e1e.slice/crio-ab4c585082b022c0a9cd9408a8cbe886894ff9fe187223d42b92de33d41b8b49 WatchSource:0}: Error finding container ab4c585082b022c0a9cd9408a8cbe886894ff9fe187223d42b92de33d41b8b49: Status 404 returned error can't find the container with id ab4c585082b022c0a9cd9408a8cbe886894ff9fe187223d42b92de33d41b8b49 Mar 18 12:33:58 crc kubenswrapper[4921]: I0318 12:33:58.990153 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.118144 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.118103 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56dcf122-202d-4ead-9bf0-9dd079a704d0","Type":"ContainerDied","Data":"f75a3d242d0efd233377ee105c20e456017cd611da35b7152981cb37077b9e7b"} Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.118579 4921 scope.go:117] "RemoveContainer" containerID="6478000bb7c1e5c177c7ea148a7f8e31375e79721e0974a5882d10b07ae4499f" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.123748 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"73d4aa03-5839-43ea-803e-64d12b544e1e","Type":"ContainerStarted","Data":"ab4c585082b022c0a9cd9408a8cbe886894ff9fe187223d42b92de33d41b8b49"} Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.155182 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.155327 4921 scope.go:117] "RemoveContainer" containerID="32d9483b1ef29e071dfcef7660f3668f6a7910000f4ea266f4f0821240c9dc76" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.187717 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.199174 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:59 crc kubenswrapper[4921]: E0318 12:33:59.199638 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-log" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.199656 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-log" Mar 18 12:33:59 crc kubenswrapper[4921]: E0318 12:33:59.199693 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-api" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.199700 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-api" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.199924 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-log" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.199947 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" containerName="nova-api-api" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.200936 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.206620 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.206728 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.230426 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17799c79-7099-46dd-baf5-444063789a7d" path="/var/lib/kubelet/pods/17799c79-7099-46dd-baf5-444063789a7d/volumes" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.230992 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56dcf122-202d-4ead-9bf0-9dd079a704d0" path="/var/lib/kubelet/pods/56dcf122-202d-4ead-9bf0-9dd079a704d0/volumes" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.244339 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-logs\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 
12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.245166 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.245461 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-config-data\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.245754 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5vm\" (UniqueName: \"kubernetes.io/projected/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-kube-api-access-mx5vm\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.347673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-logs\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.348005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.348145 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-config-data\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.348288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5vm\" (UniqueName: \"kubernetes.io/projected/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-kube-api-access-mx5vm\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.348180 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-logs\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.352550 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.353475 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-config-data\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.366835 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5vm\" (UniqueName: \"kubernetes.io/projected/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-kube-api-access-mx5vm\") pod \"nova-api-0\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.430449 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.430507 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.477210 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.527172 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:33:59 crc kubenswrapper[4921]: I0318 12:33:59.993697 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:00 crc kubenswrapper[4921]: W0318 12:34:00.005522 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba4644c8_fadc_48dd_affb_ea89a4ead9ca.slice/crio-b198f2358b908b5cec69ae580dac0bd58222fec62cf358f3b18029df1e40cb9c WatchSource:0}: Error finding container b198f2358b908b5cec69ae580dac0bd58222fec62cf358f3b18029df1e40cb9c: Status 404 returned error can't find the container with id b198f2358b908b5cec69ae580dac0bd58222fec62cf358f3b18029df1e40cb9c Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.134827 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563954-2bkbb"] Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.138150 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.141304 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.141321 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.141987 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.156376 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-2bkbb"] Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.160259 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4644c8-fadc-48dd-affb-ea89a4ead9ca","Type":"ContainerStarted","Data":"b198f2358b908b5cec69ae580dac0bd58222fec62cf358f3b18029df1e40cb9c"} Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.167711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"73d4aa03-5839-43ea-803e-64d12b544e1e","Type":"ContainerStarted","Data":"45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4"} Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.186858 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.186839356 podStartE2EDuration="2.186839356s" podCreationTimestamp="2026-03-18 12:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:00.186003342 +0000 UTC m=+1459.735923981" watchObservedRunningTime="2026-03-18 12:34:00.186839356 +0000 UTC m=+1459.736759995" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 
12:34:00.222476 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.263826 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqp4g\" (UniqueName: \"kubernetes.io/projected/6c18cacd-e41a-4e03-ac32-0633e90d60c1-kube-api-access-rqp4g\") pod \"auto-csr-approver-29563954-2bkbb\" (UID: \"6c18cacd-e41a-4e03-ac32-0633e90d60c1\") " pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.270825 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvj5"] Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.365025 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqp4g\" (UniqueName: \"kubernetes.io/projected/6c18cacd-e41a-4e03-ac32-0633e90d60c1-kube-api-access-rqp4g\") pod \"auto-csr-approver-29563954-2bkbb\" (UID: \"6c18cacd-e41a-4e03-ac32-0633e90d60c1\") " pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.383810 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqp4g\" (UniqueName: \"kubernetes.io/projected/6c18cacd-e41a-4e03-ac32-0633e90d60c1-kube-api-access-rqp4g\") pod \"auto-csr-approver-29563954-2bkbb\" (UID: \"6c18cacd-e41a-4e03-ac32-0633e90d60c1\") " pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.473420 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:00 crc kubenswrapper[4921]: W0318 12:34:00.963714 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c18cacd_e41a_4e03_ac32_0633e90d60c1.slice/crio-3aae52b5c1bfb5acac9fb9d5ae1bf7242c076aa0ca1f238aab6364a4d2cddddd WatchSource:0}: Error finding container 3aae52b5c1bfb5acac9fb9d5ae1bf7242c076aa0ca1f238aab6364a4d2cddddd: Status 404 returned error can't find the container with id 3aae52b5c1bfb5acac9fb9d5ae1bf7242c076aa0ca1f238aab6364a4d2cddddd Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.967463 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:34:00 crc kubenswrapper[4921]: I0318 12:34:00.971965 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-2bkbb"] Mar 18 12:34:01 crc kubenswrapper[4921]: I0318 12:34:01.178131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" event={"ID":"6c18cacd-e41a-4e03-ac32-0633e90d60c1","Type":"ContainerStarted","Data":"3aae52b5c1bfb5acac9fb9d5ae1bf7242c076aa0ca1f238aab6364a4d2cddddd"} Mar 18 12:34:01 crc kubenswrapper[4921]: I0318 12:34:01.181518 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4644c8-fadc-48dd-affb-ea89a4ead9ca","Type":"ContainerStarted","Data":"5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa"} Mar 18 12:34:01 crc kubenswrapper[4921]: I0318 12:34:01.181561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4644c8-fadc-48dd-affb-ea89a4ead9ca","Type":"ContainerStarted","Data":"bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b"} Mar 18 12:34:01 crc kubenswrapper[4921]: I0318 12:34:01.206561 4921 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.20654186 podStartE2EDuration="2.20654186s" podCreationTimestamp="2026-03-18 12:33:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:01.199317815 +0000 UTC m=+1460.749238474" watchObservedRunningTime="2026-03-18 12:34:01.20654186 +0000 UTC m=+1460.756462499" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.190576 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4pvj5" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="registry-server" containerID="cri-o://1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578" gracePeriod=2 Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.704347 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.718296 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-catalog-content\") pod \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.718360 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vz6\" (UniqueName: \"kubernetes.io/projected/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-kube-api-access-p5vz6\") pod \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.718422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-utilities\") pod 
\"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\" (UID: \"10885b1e-9d73-4fcd-b0da-5e73707e9c6b\") " Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.728603 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-utilities" (OuterVolumeSpecName: "utilities") pod "10885b1e-9d73-4fcd-b0da-5e73707e9c6b" (UID: "10885b1e-9d73-4fcd-b0da-5e73707e9c6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.730807 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-kube-api-access-p5vz6" (OuterVolumeSpecName: "kube-api-access-p5vz6") pod "10885b1e-9d73-4fcd-b0da-5e73707e9c6b" (UID: "10885b1e-9d73-4fcd-b0da-5e73707e9c6b"). InnerVolumeSpecName "kube-api-access-p5vz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.758469 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10885b1e-9d73-4fcd-b0da-5e73707e9c6b" (UID: "10885b1e-9d73-4fcd-b0da-5e73707e9c6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.819313 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.819346 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5vz6\" (UniqueName: \"kubernetes.io/projected/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-kube-api-access-p5vz6\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:02 crc kubenswrapper[4921]: I0318 12:34:02.819361 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10885b1e-9d73-4fcd-b0da-5e73707e9c6b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.203033 4921 generic.go:334] "Generic (PLEG): container finished" podID="6c18cacd-e41a-4e03-ac32-0633e90d60c1" containerID="883b6fdd80905e59b68bd5edf1ef12d420de787a692ba232378312e1af55dfbd" exitCode=0 Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.203166 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" event={"ID":"6c18cacd-e41a-4e03-ac32-0633e90d60c1","Type":"ContainerDied","Data":"883b6fdd80905e59b68bd5edf1ef12d420de787a692ba232378312e1af55dfbd"} Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.207844 4921 generic.go:334] "Generic (PLEG): container finished" podID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerID="1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578" exitCode=0 Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.207888 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvj5" 
event={"ID":"10885b1e-9d73-4fcd-b0da-5e73707e9c6b","Type":"ContainerDied","Data":"1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578"} Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.207944 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvj5" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.207982 4921 scope.go:117] "RemoveContainer" containerID="1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.224448 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvj5" event={"ID":"10885b1e-9d73-4fcd-b0da-5e73707e9c6b","Type":"ContainerDied","Data":"c22956439338bfbb1f2a15db32aa0b56a330e7de695fd40b835cb1acc0bead11"} Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.234138 4921 scope.go:117] "RemoveContainer" containerID="bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.271399 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvj5"] Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.274333 4921 scope.go:117] "RemoveContainer" containerID="73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.292472 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvj5"] Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.323981 4921 scope.go:117] "RemoveContainer" containerID="1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578" Mar 18 12:34:03 crc kubenswrapper[4921]: E0318 12:34:03.324602 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578\": container 
with ID starting with 1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578 not found: ID does not exist" containerID="1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.324657 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578"} err="failed to get container status \"1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578\": rpc error: code = NotFound desc = could not find container \"1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578\": container with ID starting with 1bbb7af9a8f58e92fe56123eb5d1c3dfb40944ff3fe7bce55fe0292fbff95578 not found: ID does not exist" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.324693 4921 scope.go:117] "RemoveContainer" containerID="bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a" Mar 18 12:34:03 crc kubenswrapper[4921]: E0318 12:34:03.325020 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a\": container with ID starting with bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a not found: ID does not exist" containerID="bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.325121 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a"} err="failed to get container status \"bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a\": rpc error: code = NotFound desc = could not find container \"bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a\": container with ID starting with bc3ff6c3c425d779512e2f0f848bea177abe49b75e980ce84bdd75f44191ca9a not 
found: ID does not exist" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.325195 4921 scope.go:117] "RemoveContainer" containerID="73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2" Mar 18 12:34:03 crc kubenswrapper[4921]: E0318 12:34:03.325611 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2\": container with ID starting with 73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2 not found: ID does not exist" containerID="73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.325652 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2"} err="failed to get container status \"73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2\": rpc error: code = NotFound desc = could not find container \"73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2\": container with ID starting with 73363d710bca03bff6fcf6f36ee4013d9e3ec5975b457361364675ca34f3c8e2 not found: ID does not exist" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.396428 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 12:34:03 crc kubenswrapper[4921]: I0318 12:34:03.496858 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:34:04 crc kubenswrapper[4921]: I0318 12:34:04.999070 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.036261 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.051616 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqp4g\" (UniqueName: \"kubernetes.io/projected/6c18cacd-e41a-4e03-ac32-0633e90d60c1-kube-api-access-rqp4g\") pod \"6c18cacd-e41a-4e03-ac32-0633e90d60c1\" (UID: \"6c18cacd-e41a-4e03-ac32-0633e90d60c1\") " Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.058034 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c18cacd-e41a-4e03-ac32-0633e90d60c1-kube-api-access-rqp4g" (OuterVolumeSpecName: "kube-api-access-rqp4g") pod "6c18cacd-e41a-4e03-ac32-0633e90d60c1" (UID: "6c18cacd-e41a-4e03-ac32-0633e90d60c1"). InnerVolumeSpecName "kube-api-access-rqp4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.154129 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqp4g\" (UniqueName: \"kubernetes.io/projected/6c18cacd-e41a-4e03-ac32-0633e90d60c1-kube-api-access-rqp4g\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.222603 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" path="/var/lib/kubelet/pods/10885b1e-9d73-4fcd-b0da-5e73707e9c6b/volumes" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.677641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" event={"ID":"6c18cacd-e41a-4e03-ac32-0633e90d60c1","Type":"ContainerDied","Data":"3aae52b5c1bfb5acac9fb9d5ae1bf7242c076aa0ca1f238aab6364a4d2cddddd"} Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.677690 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aae52b5c1bfb5acac9fb9d5ae1bf7242c076aa0ca1f238aab6364a4d2cddddd" Mar 18 12:34:05 
crc kubenswrapper[4921]: I0318 12:34:05.677713 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-2bkbb" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.774642 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:34:05 crc kubenswrapper[4921]: I0318 12:34:05.774973 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:34:06 crc kubenswrapper[4921]: I0318 12:34:06.116565 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-ffjpc"] Mar 18 12:34:06 crc kubenswrapper[4921]: I0318 12:34:06.126654 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-ffjpc"] Mar 18 12:34:06 crc kubenswrapper[4921]: I0318 12:34:06.793282 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:06 crc kubenswrapper[4921]: I0318 12:34:06.793291 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:07 crc kubenswrapper[4921]: I0318 12:34:07.228067 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08e8f99-4229-4ba3-979f-0a1cc8e3406f" path="/var/lib/kubelet/pods/e08e8f99-4229-4ba3-979f-0a1cc8e3406f/volumes" Mar 18 12:34:08 crc kubenswrapper[4921]: I0318 12:34:08.497432 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:34:08 crc kubenswrapper[4921]: I0318 12:34:08.529582 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:34:08 crc kubenswrapper[4921]: I0318 12:34:08.738400 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:34:09 crc kubenswrapper[4921]: I0318 12:34:09.527494 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:34:09 crc kubenswrapper[4921]: I0318 12:34:09.529080 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:34:09 crc kubenswrapper[4921]: I0318 12:34:09.645057 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:09 crc kubenswrapper[4921]: I0318 12:34:09.645305 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="45759e25-d3df-4741-bbc3-4111118d3d1e" containerName="kube-state-metrics" containerID="cri-o://0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f" gracePeriod=30 Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.208583 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.361917 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nlb6\" (UniqueName: \"kubernetes.io/projected/45759e25-d3df-4741-bbc3-4111118d3d1e-kube-api-access-7nlb6\") pod \"45759e25-d3df-4741-bbc3-4111118d3d1e\" (UID: \"45759e25-d3df-4741-bbc3-4111118d3d1e\") " Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.378340 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45759e25-d3df-4741-bbc3-4111118d3d1e-kube-api-access-7nlb6" (OuterVolumeSpecName: "kube-api-access-7nlb6") pod "45759e25-d3df-4741-bbc3-4111118d3d1e" (UID: "45759e25-d3df-4741-bbc3-4111118d3d1e"). InnerVolumeSpecName "kube-api-access-7nlb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.464370 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nlb6\" (UniqueName: \"kubernetes.io/projected/45759e25-d3df-4741-bbc3-4111118d3d1e-kube-api-access-7nlb6\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.609329 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.609357 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.723339 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="45759e25-d3df-4741-bbc3-4111118d3d1e" containerID="0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f" exitCode=2 Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.723475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45759e25-d3df-4741-bbc3-4111118d3d1e","Type":"ContainerDied","Data":"0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f"} Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.723503 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.723788 4921 scope.go:117] "RemoveContainer" containerID="0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.723715 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"45759e25-d3df-4741-bbc3-4111118d3d1e","Type":"ContainerDied","Data":"93a405b0ea0f31e1007feb75e7bc5b4c579bb3ab82c2fcf35a5d34a003b6ba79"} Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.746383 4921 scope.go:117] "RemoveContainer" containerID="0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f" Mar 18 12:34:10 crc kubenswrapper[4921]: E0318 12:34:10.746828 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f\": container with ID starting with 0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f not found: ID does not exist" containerID="0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.746873 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f"} 
err="failed to get container status \"0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f\": rpc error: code = NotFound desc = could not find container \"0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f\": container with ID starting with 0dadf7dc1dfbdba998885adc4887794c6441011211690711a115a7fa3bcce05f not found: ID does not exist" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.755936 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.765098 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.774685 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:10 crc kubenswrapper[4921]: E0318 12:34:10.775120 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="registry-server" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775156 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="registry-server" Mar 18 12:34:10 crc kubenswrapper[4921]: E0318 12:34:10.775181 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="extract-utilities" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775189 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="extract-utilities" Mar 18 12:34:10 crc kubenswrapper[4921]: E0318 12:34:10.775213 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c18cacd-e41a-4e03-ac32-0633e90d60c1" containerName="oc" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775219 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c18cacd-e41a-4e03-ac32-0633e90d60c1" containerName="oc" Mar 
18 12:34:10 crc kubenswrapper[4921]: E0318 12:34:10.775231 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="extract-content" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775237 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="extract-content" Mar 18 12:34:10 crc kubenswrapper[4921]: E0318 12:34:10.775247 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45759e25-d3df-4741-bbc3-4111118d3d1e" containerName="kube-state-metrics" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775253 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="45759e25-d3df-4741-bbc3-4111118d3d1e" containerName="kube-state-metrics" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775422 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="45759e25-d3df-4741-bbc3-4111118d3d1e" containerName="kube-state-metrics" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775447 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c18cacd-e41a-4e03-ac32-0633e90d60c1" containerName="oc" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.775458 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="10885b1e-9d73-4fcd-b0da-5e73707e9c6b" containerName="registry-server" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.776072 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.777697 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.778121 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.806504 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.973037 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.973100 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.973404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5blf\" (UniqueName: \"kubernetes.io/projected/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-api-access-t5blf\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:10 crc kubenswrapper[4921]: I0318 12:34:10.973565 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.075577 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.075667 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.075783 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5blf\" (UniqueName: \"kubernetes.io/projected/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-api-access-t5blf\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.075846 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.082381 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.087188 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.092178 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.098645 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5blf\" (UniqueName: \"kubernetes.io/projected/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-api-access-t5blf\") pod \"kube-state-metrics-0\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.102644 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.232283 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45759e25-d3df-4741-bbc3-4111118d3d1e" path="/var/lib/kubelet/pods/45759e25-d3df-4741-bbc3-4111118d3d1e/volumes" Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.616852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.737272 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1414d026-b9f7-4fb9-ae37-0de669bf759f","Type":"ContainerStarted","Data":"58bd963f7c92608cc2e299bca05474fee38d53522984eac94e66beabdc1ea3d7"} Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.970846 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.971628 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-central-agent" containerID="cri-o://ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b" gracePeriod=30 Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.971794 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="proxy-httpd" containerID="cri-o://4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7" gracePeriod=30 Mar 18 12:34:11 crc kubenswrapper[4921]: I0318 12:34:11.971846 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="sg-core" containerID="cri-o://b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a" gracePeriod=30 Mar 18 12:34:11 crc 
kubenswrapper[4921]: I0318 12:34:11.971895 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-notification-agent" containerID="cri-o://47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8" gracePeriod=30 Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.747938 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1414d026-b9f7-4fb9-ae37-0de669bf759f","Type":"ContainerStarted","Data":"dfc603010f57b8628b4ea1ee256d5bb53992fc758a2b5b05d3543c3947d66b31"} Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.749691 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.753504 4921 generic.go:334] "Generic (PLEG): container finished" podID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerID="4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7" exitCode=0 Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.753544 4921 generic.go:334] "Generic (PLEG): container finished" podID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerID="b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a" exitCode=2 Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.753557 4921 generic.go:334] "Generic (PLEG): container finished" podID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerID="ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b" exitCode=0 Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.753592 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerDied","Data":"4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7"} Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.753663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerDied","Data":"b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a"} Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.753678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerDied","Data":"ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b"} Mar 18 12:34:12 crc kubenswrapper[4921]: I0318 12:34:12.778826 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.120732624 podStartE2EDuration="2.778804373s" podCreationTimestamp="2026-03-18 12:34:10 +0000 UTC" firstStartedPulling="2026-03-18 12:34:11.621293597 +0000 UTC m=+1471.171214236" lastFinishedPulling="2026-03-18 12:34:12.279365346 +0000 UTC m=+1471.829285985" observedRunningTime="2026-03-18 12:34:12.773385189 +0000 UTC m=+1472.323305828" watchObservedRunningTime="2026-03-18 12:34:12.778804373 +0000 UTC m=+1472.328725002" Mar 18 12:34:13 crc kubenswrapper[4921]: I0318 12:34:13.774229 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:34:13 crc kubenswrapper[4921]: I0318 12:34:13.774295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:34:15 crc kubenswrapper[4921]: I0318 12:34:15.780837 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:34:15 crc kubenswrapper[4921]: I0318 12:34:15.781254 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:34:15 crc kubenswrapper[4921]: I0318 12:34:15.793237 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:34:15 crc kubenswrapper[4921]: I0318 12:34:15.799025 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.321022 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.327345 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478436 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-config-data\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478506 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-run-httpd\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478594 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhbrp\" (UniqueName: \"kubernetes.io/projected/08118125-56d8-489c-83fb-d54c86aff1d4-kube-api-access-dhbrp\") pod \"08118125-56d8-489c-83fb-d54c86aff1d4\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478614 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-scripts\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478684 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-log-httpd\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478700 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-sg-core-conf-yaml\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478721 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsxtw\" (UniqueName: \"kubernetes.io/projected/d4ec8a1f-95dc-4169-aee6-899cbb6db594-kube-api-access-fsxtw\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478750 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-combined-ca-bundle\") pod \"08118125-56d8-489c-83fb-d54c86aff1d4\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478788 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-config-data\") pod \"08118125-56d8-489c-83fb-d54c86aff1d4\" (UID: \"08118125-56d8-489c-83fb-d54c86aff1d4\") " Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.478810 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-combined-ca-bundle\") pod \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\" (UID: \"d4ec8a1f-95dc-4169-aee6-899cbb6db594\") " Mar 18 
12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.480394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.485695 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.486006 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ec8a1f-95dc-4169-aee6-899cbb6db594-kube-api-access-fsxtw" (OuterVolumeSpecName: "kube-api-access-fsxtw") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "kube-api-access-fsxtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.488927 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08118125-56d8-489c-83fb-d54c86aff1d4-kube-api-access-dhbrp" (OuterVolumeSpecName: "kube-api-access-dhbrp") pod "08118125-56d8-489c-83fb-d54c86aff1d4" (UID: "08118125-56d8-489c-83fb-d54c86aff1d4"). InnerVolumeSpecName "kube-api-access-dhbrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.492832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-scripts" (OuterVolumeSpecName: "scripts") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.510777 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-config-data" (OuterVolumeSpecName: "config-data") pod "08118125-56d8-489c-83fb-d54c86aff1d4" (UID: "08118125-56d8-489c-83fb-d54c86aff1d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.519209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08118125-56d8-489c-83fb-d54c86aff1d4" (UID: "08118125-56d8-489c-83fb-d54c86aff1d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.531206 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.579579 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580534 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580557 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhbrp\" (UniqueName: \"kubernetes.io/projected/08118125-56d8-489c-83fb-d54c86aff1d4-kube-api-access-dhbrp\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580567 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580575 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4ec8a1f-95dc-4169-aee6-899cbb6db594-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580584 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580601 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsxtw\" (UniqueName: 
\"kubernetes.io/projected/d4ec8a1f-95dc-4169-aee6-899cbb6db594-kube-api-access-fsxtw\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580615 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580625 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08118125-56d8-489c-83fb-d54c86aff1d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.580635 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.607182 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-config-data" (OuterVolumeSpecName: "config-data") pod "d4ec8a1f-95dc-4169-aee6-899cbb6db594" (UID: "d4ec8a1f-95dc-4169-aee6-899cbb6db594"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.682548 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ec8a1f-95dc-4169-aee6-899cbb6db594-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.798408 4921 generic.go:334] "Generic (PLEG): container finished" podID="08118125-56d8-489c-83fb-d54c86aff1d4" containerID="4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520" exitCode=137 Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.798500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08118125-56d8-489c-83fb-d54c86aff1d4","Type":"ContainerDied","Data":"4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520"} Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.798535 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"08118125-56d8-489c-83fb-d54c86aff1d4","Type":"ContainerDied","Data":"d07ff2b8cd067f8f06830c2babbe72745c24d0288f80ed4d73d3db1014a28d5d"} Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.798555 4921 scope.go:117] "RemoveContainer" containerID="4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.798703 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.807843 4921 generic.go:334] "Generic (PLEG): container finished" podID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerID="47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8" exitCode=0 Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.809043 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.818925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerDied","Data":"47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8"} Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.819203 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4ec8a1f-95dc-4169-aee6-899cbb6db594","Type":"ContainerDied","Data":"49f5145acc314267b8b50e1cb96b1b74ff1de97a13f7c5e4b609e994fbab4f20"} Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.882517 4921 scope.go:117] "RemoveContainer" containerID="4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520" Mar 18 12:34:16 crc kubenswrapper[4921]: E0318 12:34:16.894296 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520\": container with ID starting with 4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520 not found: ID does not exist" containerID="4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.894542 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520"} err="failed to get container status \"4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520\": rpc error: code = NotFound desc = could not find container \"4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520\": container with ID starting with 4f780681f37923c892c3ca2d86043b73372d7324c2e70feec6e684e8cbad4520 not found: ID does not exist" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.894650 4921 scope.go:117] "RemoveContainer" 
containerID="4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.913482 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.927494 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.944088 4921 scope.go:117] "RemoveContainer" containerID="b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.945988 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.967268 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.984130 4921 scope.go:117] "RemoveContainer" containerID="47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.985905 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:16 crc kubenswrapper[4921]: E0318 12:34:16.986331 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="sg-core" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986350 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="sg-core" Mar 18 12:34:16 crc kubenswrapper[4921]: E0318 12:34:16.986367 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-central-agent" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986375 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-central-agent" Mar 18 
12:34:16 crc kubenswrapper[4921]: E0318 12:34:16.986397 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="proxy-httpd" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986403 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="proxy-httpd" Mar 18 12:34:16 crc kubenswrapper[4921]: E0318 12:34:16.986420 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-notification-agent" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986426 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-notification-agent" Mar 18 12:34:16 crc kubenswrapper[4921]: E0318 12:34:16.986439 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08118125-56d8-489c-83fb-d54c86aff1d4" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986445 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="08118125-56d8-489c-83fb-d54c86aff1d4" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986607 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="08118125-56d8-489c-83fb-d54c86aff1d4" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986621 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-central-agent" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986635 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="sg-core" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986643 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="ceilometer-notification-agent" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.986654 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" containerName="proxy-httpd" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.988609 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.992060 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.992367 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.994552 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.997601 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:34:16 crc kubenswrapper[4921]: I0318 12:34:16.999412 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.005610 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.005807 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.005932 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.014512 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.016326 4921 scope.go:117] "RemoveContainer" containerID="ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.025382 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.046310 4921 scope.go:117] "RemoveContainer" containerID="4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7" Mar 18 12:34:17 crc kubenswrapper[4921]: E0318 12:34:17.048886 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7\": container with ID starting with 4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7 not found: ID does not exist" containerID="4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.048953 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7"} err="failed to get container 
status \"4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7\": rpc error: code = NotFound desc = could not find container \"4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7\": container with ID starting with 4f31c02032cbb40bb61a11abf3a503673f1c9ab35cc00a446af478977b506fd7 not found: ID does not exist" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.048993 4921 scope.go:117] "RemoveContainer" containerID="b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a" Mar 18 12:34:17 crc kubenswrapper[4921]: E0318 12:34:17.052176 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a\": container with ID starting with b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a not found: ID does not exist" containerID="b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.052540 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a"} err="failed to get container status \"b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a\": rpc error: code = NotFound desc = could not find container \"b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a\": container with ID starting with b79472a6db60c48a38bac0a1dd70001c21460278cd6c86fa8938bdd03c75c08a not found: ID does not exist" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.052851 4921 scope.go:117] "RemoveContainer" containerID="47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8" Mar 18 12:34:17 crc kubenswrapper[4921]: E0318 12:34:17.053348 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8\": container with ID starting with 47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8 not found: ID does not exist" containerID="47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.053404 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8"} err="failed to get container status \"47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8\": rpc error: code = NotFound desc = could not find container \"47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8\": container with ID starting with 47c51b2b31f03906099d325290756716b977ef3156085e2a5b299c9dbdc83bd8 not found: ID does not exist" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.053438 4921 scope.go:117] "RemoveContainer" containerID="ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b" Mar 18 12:34:17 crc kubenswrapper[4921]: E0318 12:34:17.053813 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b\": container with ID starting with ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b not found: ID does not exist" containerID="ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.053846 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b"} err="failed to get container status \"ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b\": rpc error: code = NotFound desc = could not find container \"ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b\": container with ID 
starting with ad0b89a8b5df68453b13b0821f6d1e998c56a3a9926a398f47e935bcd604a79b not found: ID does not exist" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099131 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-scripts\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099177 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-config-data\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099227 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099288 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-run-httpd\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099311 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-log-httpd\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 
12:34:17.099336 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdml\" (UniqueName: \"kubernetes.io/projected/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-kube-api-access-4mdml\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099381 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099460 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099480 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099508 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.099544 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbm2b\" (UniqueName: \"kubernetes.io/projected/b032b317-787f-4f39-bf12-aff187fb862f-kube-api-access-cbm2b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201149 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-scripts\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201208 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-config-data\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201250 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-run-httpd\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201326 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-log-httpd\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdml\" (UniqueName: \"kubernetes.io/projected/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-kube-api-access-4mdml\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201402 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201439 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " 
pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.201463 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.202190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-run-httpd\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.202327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-log-httpd\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.203670 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.203721 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.203771 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.203853 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbm2b\" (UniqueName: \"kubernetes.io/projected/b032b317-787f-4f39-bf12-aff187fb862f-kube-api-access-cbm2b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.207443 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.208045 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-scripts\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.208973 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.209171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.210023 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-config-data\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.210229 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.210639 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.212011 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.216621 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.220570 4921 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="08118125-56d8-489c-83fb-d54c86aff1d4" path="/var/lib/kubelet/pods/08118125-56d8-489c-83fb-d54c86aff1d4/volumes" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.221380 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ec8a1f-95dc-4169-aee6-899cbb6db594" path="/var/lib/kubelet/pods/d4ec8a1f-95dc-4169-aee6-899cbb6db594/volumes" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.225132 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdml\" (UniqueName: \"kubernetes.io/projected/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-kube-api-access-4mdml\") pod \"ceilometer-0\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.226530 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbm2b\" (UniqueName: \"kubernetes.io/projected/b032b317-787f-4f39-bf12-aff187fb862f-kube-api-access-cbm2b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.321837 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.335634 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.529976 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.530928 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.810213 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:17 crc kubenswrapper[4921]: W0318 12:34:17.819392 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d111ef2_f8e9_472c_b0f6_3dcd631d789d.slice/crio-b8c8f3380207e3cb99799b4b0bb2fe54e7b62ca10a8d279a518da4e68289a695 WatchSource:0}: Error finding container b8c8f3380207e3cb99799b4b0bb2fe54e7b62ca10a8d279a518da4e68289a695: Status 404 returned error can't find the container with id b8c8f3380207e3cb99799b4b0bb2fe54e7b62ca10a8d279a518da4e68289a695 Mar 18 12:34:17 crc kubenswrapper[4921]: I0318 12:34:17.826981 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:34:17 crc kubenswrapper[4921]: W0318 12:34:17.834174 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb032b317_787f_4f39_bf12_aff187fb862f.slice/crio-0500708b258ca4763f923aea09cf242fd64c6ed3598052fb75fcb177f3402a1f WatchSource:0}: Error finding container 0500708b258ca4763f923aea09cf242fd64c6ed3598052fb75fcb177f3402a1f: Status 404 returned error can't find the container with id 0500708b258ca4763f923aea09cf242fd64c6ed3598052fb75fcb177f3402a1f Mar 18 12:34:18 crc kubenswrapper[4921]: I0318 12:34:18.839727 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerStarted","Data":"4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014"} Mar 18 12:34:18 crc kubenswrapper[4921]: I0318 12:34:18.840327 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerStarted","Data":"b8c8f3380207e3cb99799b4b0bb2fe54e7b62ca10a8d279a518da4e68289a695"} Mar 18 12:34:18 crc kubenswrapper[4921]: I0318 12:34:18.841949 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b032b317-787f-4f39-bf12-aff187fb862f","Type":"ContainerStarted","Data":"f0d4324420151331450e6d095a9a2716913bafe1a8e2ec54e2ee054bd54bde23"} Mar 18 12:34:18 crc kubenswrapper[4921]: I0318 12:34:18.841992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b032b317-787f-4f39-bf12-aff187fb862f","Type":"ContainerStarted","Data":"0500708b258ca4763f923aea09cf242fd64c6ed3598052fb75fcb177f3402a1f"} Mar 18 12:34:18 crc kubenswrapper[4921]: I0318 12:34:18.869952 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.869931906 podStartE2EDuration="2.869931906s" podCreationTimestamp="2026-03-18 12:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:18.863696489 +0000 UTC m=+1478.413617128" watchObservedRunningTime="2026-03-18 12:34:18.869931906 +0000 UTC m=+1478.419852545" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.538934 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.539389 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:34:19 crc 
kubenswrapper[4921]: I0318 12:34:19.545083 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.549534 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.771956 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-fbcg4"] Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.773476 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.795439 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-fbcg4"] Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.869991 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerStarted","Data":"300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a"} Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.876865 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.876907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-config\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.876938 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstr2\" (UniqueName: \"kubernetes.io/projected/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-kube-api-access-jstr2\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.877001 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.877033 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.877060 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.978836 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 
12:34:19.979950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-config\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.979887 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.980053 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstr2\" (UniqueName: \"kubernetes.io/projected/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-kube-api-access-jstr2\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.980640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.980754 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-config\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.981325 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.981430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.981866 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.982954 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:19 crc kubenswrapper[4921]: I0318 12:34:19.983239 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:20 crc kubenswrapper[4921]: I0318 12:34:20.003849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstr2\" (UniqueName: 
\"kubernetes.io/projected/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-kube-api-access-jstr2\") pod \"dnsmasq-dns-89c5cd4d5-fbcg4\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:20 crc kubenswrapper[4921]: I0318 12:34:20.265651 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:20 crc kubenswrapper[4921]: W0318 12:34:20.774067 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ca2e8c_3d98_4448_afa4_8a278b5f0c54.slice/crio-1ed230a204f4e56c3255eae9face8bf1b5eab134a84132b30b3f8b192561fd0a WatchSource:0}: Error finding container 1ed230a204f4e56c3255eae9face8bf1b5eab134a84132b30b3f8b192561fd0a: Status 404 returned error can't find the container with id 1ed230a204f4e56c3255eae9face8bf1b5eab134a84132b30b3f8b192561fd0a Mar 18 12:34:20 crc kubenswrapper[4921]: I0318 12:34:20.778231 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-fbcg4"] Mar 18 12:34:20 crc kubenswrapper[4921]: I0318 12:34:20.882834 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerStarted","Data":"54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b"} Mar 18 12:34:20 crc kubenswrapper[4921]: I0318 12:34:20.886780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" event={"ID":"66ca2e8c-3d98-4448-afa4-8a278b5f0c54","Type":"ContainerStarted","Data":"1ed230a204f4e56c3255eae9face8bf1b5eab134a84132b30b3f8b192561fd0a"} Mar 18 12:34:21 crc kubenswrapper[4921]: I0318 12:34:21.115486 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 12:34:21 crc kubenswrapper[4921]: I0318 12:34:21.896892 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerID="da6d212c129e736460b40210e84be01f919a959e7de91a56b27199b3820ed71d" exitCode=0 Mar 18 12:34:21 crc kubenswrapper[4921]: I0318 12:34:21.897169 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" event={"ID":"66ca2e8c-3d98-4448-afa4-8a278b5f0c54","Type":"ContainerDied","Data":"da6d212c129e736460b40210e84be01f919a959e7de91a56b27199b3820ed71d"} Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.336986 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.854910 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.855182 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-log" containerID="cri-o://bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b" gracePeriod=30 Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.855345 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-api" containerID="cri-o://5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa" gracePeriod=30 Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.922432 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerStarted","Data":"e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db"} Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.922572 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.928409 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" event={"ID":"66ca2e8c-3d98-4448-afa4-8a278b5f0c54","Type":"ContainerStarted","Data":"696963e76f2c9834064e1cc263a38d695183d851da7f00d1ff0c9af0d196e352"} Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.928525 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.954125 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.85426074 podStartE2EDuration="6.954079292s" podCreationTimestamp="2026-03-18 12:34:16 +0000 UTC" firstStartedPulling="2026-03-18 12:34:17.827488816 +0000 UTC m=+1477.377409455" lastFinishedPulling="2026-03-18 12:34:21.927307368 +0000 UTC m=+1481.477228007" observedRunningTime="2026-03-18 12:34:22.943396979 +0000 UTC m=+1482.493317638" watchObservedRunningTime="2026-03-18 12:34:22.954079292 +0000 UTC m=+1482.503999931" Mar 18 12:34:22 crc kubenswrapper[4921]: I0318 12:34:22.977563 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" podStartSLOduration=3.977542958 podStartE2EDuration="3.977542958s" podCreationTimestamp="2026-03-18 12:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:22.976414676 +0000 UTC m=+1482.526335315" watchObservedRunningTime="2026-03-18 12:34:22.977542958 +0000 UTC m=+1482.527463597" Mar 18 12:34:23 crc kubenswrapper[4921]: I0318 12:34:23.377441 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:23 crc kubenswrapper[4921]: I0318 12:34:23.939566 4921 generic.go:334] "Generic (PLEG): container finished" podID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerID="bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b" 
exitCode=143 Mar 18 12:34:23 crc kubenswrapper[4921]: I0318 12:34:23.939794 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4644c8-fadc-48dd-affb-ea89a4ead9ca","Type":"ContainerDied","Data":"bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b"} Mar 18 12:34:24 crc kubenswrapper[4921]: I0318 12:34:24.948324 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-central-agent" containerID="cri-o://4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014" gracePeriod=30 Mar 18 12:34:24 crc kubenswrapper[4921]: I0318 12:34:24.948905 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="proxy-httpd" containerID="cri-o://e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db" gracePeriod=30 Mar 18 12:34:24 crc kubenswrapper[4921]: I0318 12:34:24.948967 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="sg-core" containerID="cri-o://54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b" gracePeriod=30 Mar 18 12:34:24 crc kubenswrapper[4921]: I0318 12:34:24.949013 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-notification-agent" containerID="cri-o://300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a" gracePeriod=30 Mar 18 12:34:25 crc kubenswrapper[4921]: I0318 12:34:25.960992 4921 generic.go:334] "Generic (PLEG): container finished" podID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerID="e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db" exitCode=0 Mar 18 12:34:25 crc 
kubenswrapper[4921]: I0318 12:34:25.961333 4921 generic.go:334] "Generic (PLEG): container finished" podID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerID="54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b" exitCode=2 Mar 18 12:34:25 crc kubenswrapper[4921]: I0318 12:34:25.961348 4921 generic.go:334] "Generic (PLEG): container finished" podID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerID="300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a" exitCode=0 Mar 18 12:34:25 crc kubenswrapper[4921]: I0318 12:34:25.961365 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerDied","Data":"e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db"} Mar 18 12:34:25 crc kubenswrapper[4921]: I0318 12:34:25.961390 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerDied","Data":"54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b"} Mar 18 12:34:25 crc kubenswrapper[4921]: I0318 12:34:25.961399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerDied","Data":"300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a"} Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.435892 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.468271 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-config-data\") pod \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.468340 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-combined-ca-bundle\") pod \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.468428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-logs\") pod \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.468467 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5vm\" (UniqueName: \"kubernetes.io/projected/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-kube-api-access-mx5vm\") pod \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\" (UID: \"ba4644c8-fadc-48dd-affb-ea89a4ead9ca\") " Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.469544 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-logs" (OuterVolumeSpecName: "logs") pod "ba4644c8-fadc-48dd-affb-ea89a4ead9ca" (UID: "ba4644c8-fadc-48dd-affb-ea89a4ead9ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.474576 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-kube-api-access-mx5vm" (OuterVolumeSpecName: "kube-api-access-mx5vm") pod "ba4644c8-fadc-48dd-affb-ea89a4ead9ca" (UID: "ba4644c8-fadc-48dd-affb-ea89a4ead9ca"). InnerVolumeSpecName "kube-api-access-mx5vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.496770 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-config-data" (OuterVolumeSpecName: "config-data") pod "ba4644c8-fadc-48dd-affb-ea89a4ead9ca" (UID: "ba4644c8-fadc-48dd-affb-ea89a4ead9ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.504393 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba4644c8-fadc-48dd-affb-ea89a4ead9ca" (UID: "ba4644c8-fadc-48dd-affb-ea89a4ead9ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.569883 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.569913 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.569926 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.569937 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5vm\" (UniqueName: \"kubernetes.io/projected/ba4644c8-fadc-48dd-affb-ea89a4ead9ca-kube-api-access-mx5vm\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.976433 4921 generic.go:334] "Generic (PLEG): container finished" podID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerID="5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa" exitCode=0 Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.976830 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.977273 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4644c8-fadc-48dd-affb-ea89a4ead9ca","Type":"ContainerDied","Data":"5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa"} Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.977307 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4644c8-fadc-48dd-affb-ea89a4ead9ca","Type":"ContainerDied","Data":"b198f2358b908b5cec69ae580dac0bd58222fec62cf358f3b18029df1e40cb9c"} Mar 18 12:34:26 crc kubenswrapper[4921]: I0318 12:34:26.977329 4921 scope.go:117] "RemoveContainer" containerID="5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.014776 4921 scope.go:117] "RemoveContainer" containerID="bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.022574 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.039651 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.048485 4921 scope.go:117] "RemoveContainer" containerID="5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa" Mar 18 12:34:27 crc kubenswrapper[4921]: E0318 12:34:27.049738 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa\": container with ID starting with 5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa not found: ID does not exist" containerID="5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.049773 
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa"} err="failed to get container status \"5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa\": rpc error: code = NotFound desc = could not find container \"5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa\": container with ID starting with 5c2d6461ee2d0c93e3870c1128ea83fa6b49a5ffbc008a9e8be936a80df844aa not found: ID does not exist" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.049792 4921 scope.go:117] "RemoveContainer" containerID="bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b" Mar 18 12:34:27 crc kubenswrapper[4921]: E0318 12:34:27.050094 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b\": container with ID starting with bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b not found: ID does not exist" containerID="bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.050135 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b"} err="failed to get container status \"bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b\": rpc error: code = NotFound desc = could not find container \"bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b\": container with ID starting with bd6f8f73829ba19bbaf06b7df26a15dd487f8ef70feb8fd19f5595fcb4af570b not found: ID does not exist" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.055243 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:27 crc kubenswrapper[4921]: E0318 12:34:27.055753 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-log" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.055775 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-log" Mar 18 12:34:27 crc kubenswrapper[4921]: E0318 12:34:27.055810 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-api" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.055819 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-api" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.056026 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-api" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.056059 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" containerName="nova-api-log" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.057324 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.074480 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.074798 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.074954 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.079424 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.079469 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-config-data\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.079556 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx9z2\" (UniqueName: \"kubernetes.io/projected/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-kube-api-access-lx9z2\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.079591 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-logs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " 
pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.080083 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.080141 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.082319 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.181331 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx9z2\" (UniqueName: \"kubernetes.io/projected/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-kube-api-access-lx9z2\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.181572 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-logs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.181639 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc 
kubenswrapper[4921]: I0318 12:34:27.181671 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.181721 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.181737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-config-data\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.183582 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-logs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.186627 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-config-data\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.186762 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.188248 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.192915 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.201440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx9z2\" (UniqueName: \"kubernetes.io/projected/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-kube-api-access-lx9z2\") pod \"nova-api-0\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") " pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.234245 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4644c8-fadc-48dd-affb-ea89a4ead9ca" path="/var/lib/kubelet/pods/ba4644c8-fadc-48dd-affb-ea89a4ead9ca/volumes" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.337212 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.373091 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.428076 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.468685 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.589255 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-run-httpd\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.589646 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-config-data\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.589718 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.589789 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-ceilometer-tls-certs\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.589820 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdml\" (UniqueName: \"kubernetes.io/projected/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-kube-api-access-4mdml\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.589959 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-sg-core-conf-yaml\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.590023 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-log-httpd\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.590067 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-scripts\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.590233 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-combined-ca-bundle\") pod \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\" (UID: \"9d111ef2-f8e9-472c-b0f6-3dcd631d789d\") " Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.591399 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.595311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.597628 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-kube-api-access-4mdml" (OuterVolumeSpecName: "kube-api-access-4mdml") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "kube-api-access-4mdml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.597921 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-scripts" (OuterVolumeSpecName: "scripts") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.628931 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.658217 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.684974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: "9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.692485 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.692522 4921 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.692536 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mdml\" (UniqueName: \"kubernetes.io/projected/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-kube-api-access-4mdml\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.692549 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.692563 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.692574 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.716938 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-config-data" (OuterVolumeSpecName: "config-data") pod "9d111ef2-f8e9-472c-b0f6-3dcd631d789d" (UID: 
"9d111ef2-f8e9-472c-b0f6-3dcd631d789d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.794337 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d111ef2-f8e9-472c-b0f6-3dcd631d789d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.969976 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:27 crc kubenswrapper[4921]: W0318 12:34:27.977711 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5d3cd82_b9b2_42ab_be5e_949a84dd58b1.slice/crio-6a7438629307d864fea8327d79f3d81169708cccd4d8506d26cce5949e615b94 WatchSource:0}: Error finding container 6a7438629307d864fea8327d79f3d81169708cccd4d8506d26cce5949e615b94: Status 404 returned error can't find the container with id 6a7438629307d864fea8327d79f3d81169708cccd4d8506d26cce5949e615b94 Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.992316 4921 generic.go:334] "Generic (PLEG): container finished" podID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerID="4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014" exitCode=0 Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.992599 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.993576 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerDied","Data":"4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014"} Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.993608 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d111ef2-f8e9-472c-b0f6-3dcd631d789d","Type":"ContainerDied","Data":"b8c8f3380207e3cb99799b4b0bb2fe54e7b62ca10a8d279a518da4e68289a695"} Mar 18 12:34:27 crc kubenswrapper[4921]: I0318 12:34:27.993626 4921 scope.go:117] "RemoveContainer" containerID="e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.011283 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.022713 4921 scope.go:117] "RemoveContainer" containerID="54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.045184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.059173 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.065958 4921 scope.go:117] "RemoveContainer" containerID="300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.083831 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.084391 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="sg-core" 
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084418 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="sg-core" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.084439 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-notification-agent" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084448 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-notification-agent" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.084474 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="proxy-httpd" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084482 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="proxy-httpd" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.084512 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-central-agent" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084521 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-central-agent" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084756 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-central-agent" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084789 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="ceilometer-notification-agent" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084803 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="proxy-httpd" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.084815 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" containerName="sg-core" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.087316 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.092179 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.092194 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.096234 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.098938 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-config-data\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.098973 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.099010 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.099024 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-log-httpd\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.099043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.099063 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-run-httpd\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.099126 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkmp\" (UniqueName: \"kubernetes.io/projected/4c22e952-1a8a-4998-bcc4-72114cb84c82-kube-api-access-blkmp\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.099175 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-scripts\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.111805 
4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.153270 4921 scope.go:117] "RemoveContainer" containerID="4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.178319 4921 scope.go:117] "RemoveContainer" containerID="e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.179317 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db\": container with ID starting with e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db not found: ID does not exist" containerID="e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.179351 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db"} err="failed to get container status \"e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db\": rpc error: code = NotFound desc = could not find container \"e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db\": container with ID starting with e32b89c3281b58b2b46d0adf10c6f9eae0306bea9f5b481056ab74585282f1db not found: ID does not exist" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.179374 4921 scope.go:117] "RemoveContainer" containerID="54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.179740 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b\": container with ID starting with 
54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b not found: ID does not exist" containerID="54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.179759 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b"} err="failed to get container status \"54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b\": rpc error: code = NotFound desc = could not find container \"54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b\": container with ID starting with 54192fc5c3f55b1959b515587775abf9c41792cd0da51662cc69e4285a6c5c8b not found: ID does not exist" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.179771 4921 scope.go:117] "RemoveContainer" containerID="300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.179982 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a\": container with ID starting with 300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a not found: ID does not exist" containerID="300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.179997 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a"} err="failed to get container status \"300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a\": rpc error: code = NotFound desc = could not find container \"300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a\": container with ID starting with 300d600062d0653f5c885890e9562bfacd8a3cdf2ea11c5b7cafffa37951ce8a not found: ID does not 
exist" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.180010 4921 scope.go:117] "RemoveContainer" containerID="4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014" Mar 18 12:34:28 crc kubenswrapper[4921]: E0318 12:34:28.180601 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014\": container with ID starting with 4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014 not found: ID does not exist" containerID="4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.180617 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014"} err="failed to get container status \"4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014\": rpc error: code = NotFound desc = could not find container \"4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014\": container with ID starting with 4e15afed42455e841f437e513b7415fc394ba9521daeb3ce4690638c3bcd0014 not found: ID does not exist" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-config-data\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200334 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: 
I0318 12:34:28.200374 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200390 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-log-httpd\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200411 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200432 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-run-httpd\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200451 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkmp\" (UniqueName: \"kubernetes.io/projected/4c22e952-1a8a-4998-bcc4-72114cb84c82-kube-api-access-blkmp\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.200502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-scripts\") pod 
\"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.201804 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-log-httpd\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.202000 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-run-httpd\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.205905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.205932 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.207707 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.208228 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-config-data\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.211727 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-scripts\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.219741 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkmp\" (UniqueName: \"kubernetes.io/projected/4c22e952-1a8a-4998-bcc4-72114cb84c82-kube-api-access-blkmp\") pod \"ceilometer-0\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.354070 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-kbkfz"] Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.355529 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbkfz" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.358410 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.358589 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.384916 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbkfz"] Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.475074 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.511512 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.511663 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-config-data\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.511738 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-scripts\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.511805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksfm\" (UniqueName: \"kubernetes.io/projected/1d494dc6-a889-426d-964e-b168ba155763-kube-api-access-2ksfm\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz" Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.613098 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: 
\"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.613205 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-config-data\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.613261 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-scripts\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.613315 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksfm\" (UniqueName: \"kubernetes.io/projected/1d494dc6-a889-426d-964e-b168ba155763-kube-api-access-2ksfm\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.622235 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.622382 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-config-data\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.623866 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-scripts\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.631784 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksfm\" (UniqueName: \"kubernetes.io/projected/1d494dc6-a889-426d-964e-b168ba155763-kube-api-access-2ksfm\") pod \"nova-cell1-cell-mapping-kbkfz\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") " pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.725966 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:28 crc kubenswrapper[4921]: I0318 12:34:28.864633 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 12:34:28 crc kubenswrapper[4921]: W0318 12:34:28.905319 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c22e952_1a8a_4998_bcc4_72114cb84c82.slice/crio-20f820eb54df8a34d7de96aef11035059e28341a1f438ef8bd31c62bec5925ef WatchSource:0}: Error finding container 20f820eb54df8a34d7de96aef11035059e28341a1f438ef8bd31c62bec5925ef: Status 404 returned error can't find the container with id 20f820eb54df8a34d7de96aef11035059e28341a1f438ef8bd31c62bec5925ef
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.006873 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerStarted","Data":"20f820eb54df8a34d7de96aef11035059e28341a1f438ef8bd31c62bec5925ef"}
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.009436 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1","Type":"ContainerStarted","Data":"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"}
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.009484 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1","Type":"ContainerStarted","Data":"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"}
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.009498 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1","Type":"ContainerStarted","Data":"6a7438629307d864fea8327d79f3d81169708cccd4d8506d26cce5949e615b94"}
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.038870 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.038847585 podStartE2EDuration="2.038847585s" podCreationTimestamp="2026-03-18 12:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:29.033325518 +0000 UTC m=+1488.583246167" watchObservedRunningTime="2026-03-18 12:34:29.038847585 +0000 UTC m=+1488.588768234"
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.245772 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d111ef2-f8e9-472c-b0f6-3dcd631d789d" path="/var/lib/kubelet/pods/9d111ef2-f8e9-472c-b0f6-3dcd631d789d/volumes"
Mar 18 12:34:29 crc kubenswrapper[4921]: I0318 12:34:29.366916 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbkfz"]
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.022990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbkfz" event={"ID":"1d494dc6-a889-426d-964e-b168ba155763","Type":"ContainerStarted","Data":"0bcc716a8038376b1b7b1bc7e7ffe225d7b8b38b155200403e773bbe8b8c3326"}
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.023413 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbkfz" event={"ID":"1d494dc6-a889-426d-964e-b168ba155763","Type":"ContainerStarted","Data":"755322700ef4fee55e907166c59224059836aa1927e127178f4cab3c014ee520"}
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.028944 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerStarted","Data":"a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510"}
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.051716 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-kbkfz" podStartSLOduration=2.051691734 podStartE2EDuration="2.051691734s" podCreationTimestamp="2026-03-18 12:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:30.04062274 +0000 UTC m=+1489.590543389" watchObservedRunningTime="2026-03-18 12:34:30.051691734 +0000 UTC m=+1489.601612373"
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.267501 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4"
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.331075 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hfntw"]
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.331972 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="dnsmasq-dns" containerID="cri-o://b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5" gracePeriod=10
Mar 18 12:34:30 crc kubenswrapper[4921]: I0318 12:34:30.903702 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hfntw"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.052761 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerStarted","Data":"ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930"}
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.057002 4921 generic.go:334] "Generic (PLEG): container finished" podID="71615a6c-de34-4a90-a680-f916d9813518" containerID="b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5" exitCode=0
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.057090 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" event={"ID":"71615a6c-de34-4a90-a680-f916d9813518","Type":"ContainerDied","Data":"b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5"}
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.057143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" event={"ID":"71615a6c-de34-4a90-a680-f916d9813518","Type":"ContainerDied","Data":"560c54b560ee8c27332e6c76d407af2af64008fe9289d66db6e7d03618753249"}
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.057163 4921 scope.go:117] "RemoveContainer" containerID="b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.057160 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hfntw"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.076638 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-nb\") pod \"71615a6c-de34-4a90-a680-f916d9813518\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") "
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.076731 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-sb\") pod \"71615a6c-de34-4a90-a680-f916d9813518\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") "
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.076785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-swift-storage-0\") pod \"71615a6c-de34-4a90-a680-f916d9813518\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") "
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.076846 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vglnt\" (UniqueName: \"kubernetes.io/projected/71615a6c-de34-4a90-a680-f916d9813518-kube-api-access-vglnt\") pod \"71615a6c-de34-4a90-a680-f916d9813518\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") "
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.077022 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-svc\") pod \"71615a6c-de34-4a90-a680-f916d9813518\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") "
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.077043 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-config\") pod \"71615a6c-de34-4a90-a680-f916d9813518\" (UID: \"71615a6c-de34-4a90-a680-f916d9813518\") "
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.082470 4921 scope.go:117] "RemoveContainer" containerID="1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.094136 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71615a6c-de34-4a90-a680-f916d9813518-kube-api-access-vglnt" (OuterVolumeSpecName: "kube-api-access-vglnt") pod "71615a6c-de34-4a90-a680-f916d9813518" (UID: "71615a6c-de34-4a90-a680-f916d9813518"). InnerVolumeSpecName "kube-api-access-vglnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.180438 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vglnt\" (UniqueName: \"kubernetes.io/projected/71615a6c-de34-4a90-a680-f916d9813518-kube-api-access-vglnt\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.223803 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71615a6c-de34-4a90-a680-f916d9813518" (UID: "71615a6c-de34-4a90-a680-f916d9813518"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.257684 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71615a6c-de34-4a90-a680-f916d9813518" (UID: "71615a6c-de34-4a90-a680-f916d9813518"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.260920 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71615a6c-de34-4a90-a680-f916d9813518" (UID: "71615a6c-de34-4a90-a680-f916d9813518"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.261098 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-config" (OuterVolumeSpecName: "config") pod "71615a6c-de34-4a90-a680-f916d9813518" (UID: "71615a6c-de34-4a90-a680-f916d9813518"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.261651 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71615a6c-de34-4a90-a680-f916d9813518" (UID: "71615a6c-de34-4a90-a680-f916d9813518"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.282036 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.282068 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.282077 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.282087 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.282098 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71615a6c-de34-4a90-a680-f916d9813518-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.329794 4921 scope.go:117] "RemoveContainer" containerID="b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5"
Mar 18 12:34:31 crc kubenswrapper[4921]: E0318 12:34:31.340882 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5\": container with ID starting with b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5 not found: ID does not exist" containerID="b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.340937 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5"} err="failed to get container status \"b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5\": rpc error: code = NotFound desc = could not find container \"b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5\": container with ID starting with b0a612319a96ae95e5d3a038a7732120ecf1bfb01c704f2f434423297c526ea5 not found: ID does not exist"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.340969 4921 scope.go:117] "RemoveContainer" containerID="1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d"
Mar 18 12:34:31 crc kubenswrapper[4921]: E0318 12:34:31.341300 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d\": container with ID starting with 1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d not found: ID does not exist" containerID="1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.341329 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d"} err="failed to get container status \"1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d\": rpc error: code = NotFound desc = could not find container \"1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d\": container with ID starting with 1f647ce00f267d6e7b4549c26fe7fc71cc126065f9e1ff03119d53e4a861980d not found: ID does not exist"
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.402448 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hfntw"]
Mar 18 12:34:31 crc kubenswrapper[4921]: I0318 12:34:31.412930 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hfntw"]
Mar 18 12:34:32 crc kubenswrapper[4921]: I0318 12:34:32.069980 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerStarted","Data":"61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033"}
Mar 18 12:34:33 crc kubenswrapper[4921]: I0318 12:34:33.221607 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71615a6c-de34-4a90-a680-f916d9813518" path="/var/lib/kubelet/pods/71615a6c-de34-4a90-a680-f916d9813518/volumes"
Mar 18 12:34:34 crc kubenswrapper[4921]: I0318 12:34:34.095235 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerStarted","Data":"cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c"}
Mar 18 12:34:34 crc kubenswrapper[4921]: I0318 12:34:34.096015 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 12:34:34 crc kubenswrapper[4921]: I0318 12:34:34.154008 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.049605186 podStartE2EDuration="6.15236398s" podCreationTimestamp="2026-03-18 12:34:28 +0000 UTC" firstStartedPulling="2026-03-18 12:34:28.951888487 +0000 UTC m=+1488.501809136" lastFinishedPulling="2026-03-18 12:34:33.054647291 +0000 UTC m=+1492.604567930" observedRunningTime="2026-03-18 12:34:34.138387923 +0000 UTC m=+1493.688308572" watchObservedRunningTime="2026-03-18 12:34:34.15236398 +0000 UTC m=+1493.702284619"
Mar 18 12:34:35 crc kubenswrapper[4921]: I0318 12:34:35.154265 4921 generic.go:334] "Generic (PLEG): container finished" podID="1d494dc6-a889-426d-964e-b168ba155763" containerID="0bcc716a8038376b1b7b1bc7e7ffe225d7b8b38b155200403e773bbe8b8c3326" exitCode=0
Mar 18 12:34:35 crc kubenswrapper[4921]: I0318 12:34:35.154531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbkfz" event={"ID":"1d494dc6-a889-426d-964e-b168ba155763","Type":"ContainerDied","Data":"0bcc716a8038376b1b7b1bc7e7ffe225d7b8b38b155200403e773bbe8b8c3326"}
Mar 18 12:34:35 crc kubenswrapper[4921]: I0318 12:34:35.660380 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-hfntw" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.191:5353: i/o timeout"
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.551566 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.692793 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksfm\" (UniqueName: \"kubernetes.io/projected/1d494dc6-a889-426d-964e-b168ba155763-kube-api-access-2ksfm\") pod \"1d494dc6-a889-426d-964e-b168ba155763\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") "
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.694131 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-scripts\") pod \"1d494dc6-a889-426d-964e-b168ba155763\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") "
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.694219 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-combined-ca-bundle\") pod \"1d494dc6-a889-426d-964e-b168ba155763\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") "
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.694289 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-config-data\") pod \"1d494dc6-a889-426d-964e-b168ba155763\" (UID: \"1d494dc6-a889-426d-964e-b168ba155763\") "
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.700443 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d494dc6-a889-426d-964e-b168ba155763-kube-api-access-2ksfm" (OuterVolumeSpecName: "kube-api-access-2ksfm") pod "1d494dc6-a889-426d-964e-b168ba155763" (UID: "1d494dc6-a889-426d-964e-b168ba155763"). InnerVolumeSpecName "kube-api-access-2ksfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.707173 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-scripts" (OuterVolumeSpecName: "scripts") pod "1d494dc6-a889-426d-964e-b168ba155763" (UID: "1d494dc6-a889-426d-964e-b168ba155763"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.722184 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d494dc6-a889-426d-964e-b168ba155763" (UID: "1d494dc6-a889-426d-964e-b168ba155763"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.724816 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-config-data" (OuterVolumeSpecName: "config-data") pod "1d494dc6-a889-426d-964e-b168ba155763" (UID: "1d494dc6-a889-426d-964e-b168ba155763"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.797372 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ksfm\" (UniqueName: \"kubernetes.io/projected/1d494dc6-a889-426d-964e-b168ba155763-kube-api-access-2ksfm\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.797414 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.797424 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:36 crc kubenswrapper[4921]: I0318 12:34:36.797434 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d494dc6-a889-426d-964e-b168ba155763-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.175782 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbkfz" event={"ID":"1d494dc6-a889-426d-964e-b168ba155763","Type":"ContainerDied","Data":"755322700ef4fee55e907166c59224059836aa1927e127178f4cab3c014ee520"}
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.175829 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="755322700ef4fee55e907166c59224059836aa1927e127178f4cab3c014ee520"
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.175835 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbkfz"
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.377844 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.378216 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-api" containerID="cri-o://45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004" gracePeriod=30
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.378192 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-log" containerID="cri-o://3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175" gracePeriod=30
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.395398 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.395644 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="73d4aa03-5839-43ea-803e-64d12b544e1e" containerName="nova-scheduler-scheduler" containerID="cri-o://45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4" gracePeriod=30
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.417693 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.418072 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-metadata" containerID="cri-o://31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732" gracePeriod=30
Mar 18 12:34:37 crc kubenswrapper[4921]: I0318 12:34:37.417996 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-log" containerID="cri-o://560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4" gracePeriod=30
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.040023 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.119837 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-combined-ca-bundle\") pod \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") "
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.119995 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-config-data\") pod \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") "
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.120054 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx9z2\" (UniqueName: \"kubernetes.io/projected/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-kube-api-access-lx9z2\") pod \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") "
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.120099 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-public-tls-certs\") pod \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") "
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.120168 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-internal-tls-certs\") pod \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") "
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.120220 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-logs\") pod \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\" (UID: \"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1\") "
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.120896 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-logs" (OuterVolumeSpecName: "logs") pod "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" (UID: "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.126948 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-kube-api-access-lx9z2" (OuterVolumeSpecName: "kube-api-access-lx9z2") pod "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" (UID: "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1"). InnerVolumeSpecName "kube-api-access-lx9z2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.156261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-config-data" (OuterVolumeSpecName: "config-data") pod "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" (UID: "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.158978 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" (UID: "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188708 4921 generic.go:334] "Generic (PLEG): container finished" podID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerID="45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004" exitCode=0
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188734 4921 generic.go:334] "Generic (PLEG): container finished" podID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerID="3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175" exitCode=143
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188772 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1","Type":"ContainerDied","Data":"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"}
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188867 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1","Type":"ContainerDied","Data":"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"}
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5d3cd82-b9b2-42ab-be5e-949a84dd58b1","Type":"ContainerDied","Data":"6a7438629307d864fea8327d79f3d81169708cccd4d8506d26cce5949e615b94"}
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.188905 4921 scope.go:117] "RemoveContainer" containerID="45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.191290 4921 generic.go:334] "Generic (PLEG): container finished" podID="107bb63f-8854-49d1-b634-ef35890935c5" containerID="560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4" exitCode=143
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.191358 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"107bb63f-8854-49d1-b634-ef35890935c5","Type":"ContainerDied","Data":"560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4"}
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.191355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" (UID: "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.197477 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" (UID: "b5d3cd82-b9b2-42ab-be5e-949a84dd58b1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.222637 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.222674 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx9z2\" (UniqueName: \"kubernetes.io/projected/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-kube-api-access-lx9z2\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.222686 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.222696 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.222706 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-logs\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.222715 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.235311 4921 scope.go:117] "RemoveContainer" containerID="3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.260351 4921 scope.go:117] "RemoveContainer" containerID="45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"
Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.261038 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004\": container with ID starting with 45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004 not found: ID does not exist" containerID="45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.261072 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"} err="failed to get container status \"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004\": rpc error: code = NotFound desc = could not find container \"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004\": container with ID starting with 45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004 not found: ID does not exist"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.261094 4921 scope.go:117] "RemoveContainer" containerID="3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"
Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.261536 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175\": container with ID starting with 3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175 not found: ID does not exist" containerID="3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.261555 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"} err="failed to get container status \"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175\": rpc error: code = NotFound desc = could not find container \"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175\": container with ID starting with 3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175 not found: ID does not exist"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.261566 4921 scope.go:117] "RemoveContainer" containerID="45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.261816 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004"} err="failed to get container status \"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004\": rpc error: code = NotFound desc = could not find container \"45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004\": container with ID starting with 45c56dd78537e5d7dfff553c694dece09c1c79cf7357a4e4af0ef3157f846004 not found: ID does not exist"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.261841 4921 scope.go:117] "RemoveContainer" containerID="3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"
Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.262033 4921 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175"} err="failed to get container status \"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175\": rpc error: code = NotFound desc = could not find container \"3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175\": container with ID starting with 3e82b7e5f9941a5423148a3e11988e512a983d7261ba41b8d1c6585da7ba5175 not found: ID does not exist" Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.499055 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.509692 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.524242 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.524320 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="73d4aa03-5839-43ea-803e-64d12b544e1e" 
containerName="nova-scheduler-scheduler" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.562987 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.581084 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.594275 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.594757 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-log" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.594771 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-log" Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.594787 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="dnsmasq-dns" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.594796 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="dnsmasq-dns" Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.594807 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="init" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.594814 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="init" Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.594839 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-api" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.594845 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" 
containerName="nova-api-api" Mar 18 12:34:38 crc kubenswrapper[4921]: E0318 12:34:38.594867 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d494dc6-a889-426d-964e-b168ba155763" containerName="nova-manage" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.594873 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d494dc6-a889-426d-964e-b168ba155763" containerName="nova-manage" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.595071 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-log" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.595083 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="71615a6c-de34-4a90-a680-f916d9813518" containerName="dnsmasq-dns" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.595095 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d494dc6-a889-426d-964e-b168ba155763" containerName="nova-manage" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.595118 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" containerName="nova-api-api" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.596201 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.599216 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.599423 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.599531 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.602835 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.737361 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3271455-7c85-4b68-a27f-fb648ae6abc9-logs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.737437 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-config-data\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.737733 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.737831 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.738037 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.738200 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhkt\" (UniqueName: \"kubernetes.io/projected/a3271455-7c85-4b68-a27f-fb648ae6abc9-kube-api-access-srhkt\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.840299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.840352 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.840413 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " 
pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.840474 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhkt\" (UniqueName: \"kubernetes.io/projected/a3271455-7c85-4b68-a27f-fb648ae6abc9-kube-api-access-srhkt\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.840577 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3271455-7c85-4b68-a27f-fb648ae6abc9-logs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.840608 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-config-data\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.841482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3271455-7c85-4b68-a27f-fb648ae6abc9-logs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.844687 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.844977 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-config-data\") pod 
\"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.848083 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.849729 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.865008 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhkt\" (UniqueName: \"kubernetes.io/projected/a3271455-7c85-4b68-a27f-fb648ae6abc9-kube-api-access-srhkt\") pod \"nova-api-0\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " pod="openstack/nova-api-0" Mar 18 12:34:38 crc kubenswrapper[4921]: I0318 12:34:38.936351 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:34:39 crc kubenswrapper[4921]: I0318 12:34:39.219802 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d3cd82-b9b2-42ab-be5e-949a84dd58b1" path="/var/lib/kubelet/pods/b5d3cd82-b9b2-42ab-be5e-949a84dd58b1/volumes" Mar 18 12:34:39 crc kubenswrapper[4921]: W0318 12:34:39.404581 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3271455_7c85_4b68_a27f_fb648ae6abc9.slice/crio-c642a6e5f87422babb22afbb7db7db58bf22249deba47d2736696b09d658952f WatchSource:0}: Error finding container c642a6e5f87422babb22afbb7db7db58bf22249deba47d2736696b09d658952f: Status 404 returned error can't find the container with id c642a6e5f87422babb22afbb7db7db58bf22249deba47d2736696b09d658952f Mar 18 12:34:39 crc kubenswrapper[4921]: I0318 12:34:39.410351 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:34:40 crc kubenswrapper[4921]: I0318 12:34:40.218333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3271455-7c85-4b68-a27f-fb648ae6abc9","Type":"ContainerStarted","Data":"1a2e4314370ce5c4fae41b6b33bc781b07c039542465afe1cc609a0b0992990b"} Mar 18 12:34:40 crc kubenswrapper[4921]: I0318 12:34:40.218705 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3271455-7c85-4b68-a27f-fb648ae6abc9","Type":"ContainerStarted","Data":"2aace202e5bc4e616801b05d9c08062b31861e303a33d0aee12e11730dc18d7e"} Mar 18 12:34:40 crc kubenswrapper[4921]: I0318 12:34:40.218720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3271455-7c85-4b68-a27f-fb648ae6abc9","Type":"ContainerStarted","Data":"c642a6e5f87422babb22afbb7db7db58bf22249deba47d2736696b09d658952f"} Mar 18 12:34:40 crc kubenswrapper[4921]: I0318 12:34:40.242447 4921 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.242418983 podStartE2EDuration="2.242418983s" podCreationTimestamp="2026-03-18 12:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:40.23564399 +0000 UTC m=+1499.785564649" watchObservedRunningTime="2026-03-18 12:34:40.242418983 +0000 UTC m=+1499.792339652" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.016778 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.084938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-config-data\") pod \"107bb63f-8854-49d1-b634-ef35890935c5\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.085100 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107bb63f-8854-49d1-b634-ef35890935c5-logs\") pod \"107bb63f-8854-49d1-b634-ef35890935c5\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.085192 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-nova-metadata-tls-certs\") pod \"107bb63f-8854-49d1-b634-ef35890935c5\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.085365 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-combined-ca-bundle\") pod \"107bb63f-8854-49d1-b634-ef35890935c5\" (UID: 
\"107bb63f-8854-49d1-b634-ef35890935c5\") " Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.085450 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5dss\" (UniqueName: \"kubernetes.io/projected/107bb63f-8854-49d1-b634-ef35890935c5-kube-api-access-h5dss\") pod \"107bb63f-8854-49d1-b634-ef35890935c5\" (UID: \"107bb63f-8854-49d1-b634-ef35890935c5\") " Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.085583 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107bb63f-8854-49d1-b634-ef35890935c5-logs" (OuterVolumeSpecName: "logs") pod "107bb63f-8854-49d1-b634-ef35890935c5" (UID: "107bb63f-8854-49d1-b634-ef35890935c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.085906 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107bb63f-8854-49d1-b634-ef35890935c5-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.095139 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107bb63f-8854-49d1-b634-ef35890935c5-kube-api-access-h5dss" (OuterVolumeSpecName: "kube-api-access-h5dss") pod "107bb63f-8854-49d1-b634-ef35890935c5" (UID: "107bb63f-8854-49d1-b634-ef35890935c5"). InnerVolumeSpecName "kube-api-access-h5dss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.137428 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-config-data" (OuterVolumeSpecName: "config-data") pod "107bb63f-8854-49d1-b634-ef35890935c5" (UID: "107bb63f-8854-49d1-b634-ef35890935c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.149711 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "107bb63f-8854-49d1-b634-ef35890935c5" (UID: "107bb63f-8854-49d1-b634-ef35890935c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.162325 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "107bb63f-8854-49d1-b634-ef35890935c5" (UID: "107bb63f-8854-49d1-b634-ef35890935c5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.188441 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.188474 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5dss\" (UniqueName: \"kubernetes.io/projected/107bb63f-8854-49d1-b634-ef35890935c5-kube-api-access-h5dss\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.188485 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.188496 4921 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/107bb63f-8854-49d1-b634-ef35890935c5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.229579 4921 generic.go:334] "Generic (PLEG): container finished" podID="107bb63f-8854-49d1-b634-ef35890935c5" containerID="31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732" exitCode=0 Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.230383 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.232463 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"107bb63f-8854-49d1-b634-ef35890935c5","Type":"ContainerDied","Data":"31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732"} Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.232514 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"107bb63f-8854-49d1-b634-ef35890935c5","Type":"ContainerDied","Data":"de9c6e0fdf2d7abccd273af4842f09d6f62f568158ab6ecbdfeb91313ebeeec6"} Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.232533 4921 scope.go:117] "RemoveContainer" containerID="31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.261401 4921 scope.go:117] "RemoveContainer" containerID="560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.277854 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.290799 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.295416 4921 scope.go:117] "RemoveContainer" containerID="31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732" Mar 18 
12:34:41 crc kubenswrapper[4921]: E0318 12:34:41.306274 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732\": container with ID starting with 31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732 not found: ID does not exist" containerID="31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.306341 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732"} err="failed to get container status \"31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732\": rpc error: code = NotFound desc = could not find container \"31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732\": container with ID starting with 31ea8fb6b8fbf1f91ca161626a9c6a08c2d4df7edb44b96e0627cf3778ae7732 not found: ID does not exist" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.306371 4921 scope.go:117] "RemoveContainer" containerID="560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4" Mar 18 12:34:41 crc kubenswrapper[4921]: E0318 12:34:41.310248 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4\": container with ID starting with 560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4 not found: ID does not exist" containerID="560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.310296 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4"} err="failed to get container status 
\"560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4\": rpc error: code = NotFound desc = could not find container \"560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4\": container with ID starting with 560e09f5506d2f83ff4b7894b1069d03b6407bb5765643053b5c1e8ce20af7f4 not found: ID does not exist" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.314178 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:34:41 crc kubenswrapper[4921]: E0318 12:34:41.329252 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-metadata" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.329300 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-metadata" Mar 18 12:34:41 crc kubenswrapper[4921]: E0318 12:34:41.329320 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-log" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.329327 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-log" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.332362 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-metadata" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.332437 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="107bb63f-8854-49d1-b634-ef35890935c5" containerName="nova-metadata-log" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.343434 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.347238 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.347270 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.360042 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.514497 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3270a214-054c-4c39-aedc-9ba6fb58a7ae-logs\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.514569 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-config-data\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.514617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.514693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvrw\" (UniqueName: \"kubernetes.io/projected/3270a214-054c-4c39-aedc-9ba6fb58a7ae-kube-api-access-4jvrw\") pod \"nova-metadata-0\" (UID: 
\"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.514750 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.616199 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3270a214-054c-4c39-aedc-9ba6fb58a7ae-logs\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.616264 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-config-data\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.616299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.616359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvrw\" (UniqueName: \"kubernetes.io/projected/3270a214-054c-4c39-aedc-9ba6fb58a7ae-kube-api-access-4jvrw\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.616403 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.616780 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3270a214-054c-4c39-aedc-9ba6fb58a7ae-logs\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.621439 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.621563 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-config-data\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.622069 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.636925 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvrw\" (UniqueName: \"kubernetes.io/projected/3270a214-054c-4c39-aedc-9ba6fb58a7ae-kube-api-access-4jvrw\") pod 
\"nova-metadata-0\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") " pod="openstack/nova-metadata-0" Mar 18 12:34:41 crc kubenswrapper[4921]: I0318 12:34:41.706612 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.158920 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.241605 4921 generic.go:334] "Generic (PLEG): container finished" podID="73d4aa03-5839-43ea-803e-64d12b544e1e" containerID="45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4" exitCode=0 Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.241714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"73d4aa03-5839-43ea-803e-64d12b544e1e","Type":"ContainerDied","Data":"45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4"} Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.243931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3270a214-054c-4c39-aedc-9ba6fb58a7ae","Type":"ContainerStarted","Data":"8d691d3627853ed96d50f458350cd23c33e81c02c1e9fb8f9ba5e0f25b6571cd"} Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.493276 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.634299 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glnwk\" (UniqueName: \"kubernetes.io/projected/73d4aa03-5839-43ea-803e-64d12b544e1e-kube-api-access-glnwk\") pod \"73d4aa03-5839-43ea-803e-64d12b544e1e\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.634406 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-combined-ca-bundle\") pod \"73d4aa03-5839-43ea-803e-64d12b544e1e\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.634493 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-config-data\") pod \"73d4aa03-5839-43ea-803e-64d12b544e1e\" (UID: \"73d4aa03-5839-43ea-803e-64d12b544e1e\") " Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.649921 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d4aa03-5839-43ea-803e-64d12b544e1e-kube-api-access-glnwk" (OuterVolumeSpecName: "kube-api-access-glnwk") pod "73d4aa03-5839-43ea-803e-64d12b544e1e" (UID: "73d4aa03-5839-43ea-803e-64d12b544e1e"). InnerVolumeSpecName "kube-api-access-glnwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.666964 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-config-data" (OuterVolumeSpecName: "config-data") pod "73d4aa03-5839-43ea-803e-64d12b544e1e" (UID: "73d4aa03-5839-43ea-803e-64d12b544e1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.668381 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73d4aa03-5839-43ea-803e-64d12b544e1e" (UID: "73d4aa03-5839-43ea-803e-64d12b544e1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.736482 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.736526 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73d4aa03-5839-43ea-803e-64d12b544e1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:42 crc kubenswrapper[4921]: I0318 12:34:42.736541 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glnwk\" (UniqueName: \"kubernetes.io/projected/73d4aa03-5839-43ea-803e-64d12b544e1e-kube-api-access-glnwk\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.219399 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107bb63f-8854-49d1-b634-ef35890935c5" path="/var/lib/kubelet/pods/107bb63f-8854-49d1-b634-ef35890935c5/volumes" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.261845 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.261849 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"73d4aa03-5839-43ea-803e-64d12b544e1e","Type":"ContainerDied","Data":"ab4c585082b022c0a9cd9408a8cbe886894ff9fe187223d42b92de33d41b8b49"} Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.263481 4921 scope.go:117] "RemoveContainer" containerID="45294ec35ef9a07912785117e8bc285edb2faa5d8c7f4e90ba5f0884611570a4" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.264012 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3270a214-054c-4c39-aedc-9ba6fb58a7ae","Type":"ContainerStarted","Data":"2ad13996e9b7e948e5782e01e1bad550f76a615ba18352b010ea6a536fbadc36"} Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.264052 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3270a214-054c-4c39-aedc-9ba6fb58a7ae","Type":"ContainerStarted","Data":"82561021b876e5fc67fb9e3d59a4200ce1a39a49783bf35f15eda35ba9e8e7a5"} Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.303357 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.303315335 podStartE2EDuration="2.303315335s" podCreationTimestamp="2026-03-18 12:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:43.293520237 +0000 UTC m=+1502.843440886" watchObservedRunningTime="2026-03-18 12:34:43.303315335 +0000 UTC m=+1502.853235974" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.334826 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.347214 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.357223 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:34:43 crc kubenswrapper[4921]: E0318 12:34:43.357720 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d4aa03-5839-43ea-803e-64d12b544e1e" containerName="nova-scheduler-scheduler" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.357743 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d4aa03-5839-43ea-803e-64d12b544e1e" containerName="nova-scheduler-scheduler" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.357982 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d4aa03-5839-43ea-803e-64d12b544e1e" containerName="nova-scheduler-scheduler" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.358715 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.365340 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.366688 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.448762 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gphsf\" (UniqueName: \"kubernetes.io/projected/f3fdc858-ca78-4137-b287-a3015e80b660-kube-api-access-gphsf\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.448950 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.448979 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-config-data\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.550875 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-config-data\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.550980 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gphsf\" (UniqueName: \"kubernetes.io/projected/f3fdc858-ca78-4137-b287-a3015e80b660-kube-api-access-gphsf\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.551203 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.555850 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 
12:34:43.559839 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-config-data\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.569182 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gphsf\" (UniqueName: \"kubernetes.io/projected/f3fdc858-ca78-4137-b287-a3015e80b660-kube-api-access-gphsf\") pod \"nova-scheduler-0\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " pod="openstack/nova-scheduler-0" Mar 18 12:34:43 crc kubenswrapper[4921]: I0318 12:34:43.679231 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:34:44 crc kubenswrapper[4921]: I0318 12:34:44.160095 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:34:44 crc kubenswrapper[4921]: W0318 12:34:44.166427 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3fdc858_ca78_4137_b287_a3015e80b660.slice/crio-a57862f2693d20053806fe91152e94fa5450a604dbe305a8611bf85f26240235 WatchSource:0}: Error finding container a57862f2693d20053806fe91152e94fa5450a604dbe305a8611bf85f26240235: Status 404 returned error can't find the container with id a57862f2693d20053806fe91152e94fa5450a604dbe305a8611bf85f26240235 Mar 18 12:34:44 crc kubenswrapper[4921]: I0318 12:34:44.273665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3fdc858-ca78-4137-b287-a3015e80b660","Type":"ContainerStarted","Data":"a57862f2693d20053806fe91152e94fa5450a604dbe305a8611bf85f26240235"} Mar 18 12:34:45 crc kubenswrapper[4921]: I0318 12:34:45.227020 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="73d4aa03-5839-43ea-803e-64d12b544e1e" path="/var/lib/kubelet/pods/73d4aa03-5839-43ea-803e-64d12b544e1e/volumes" Mar 18 12:34:45 crc kubenswrapper[4921]: I0318 12:34:45.283956 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3fdc858-ca78-4137-b287-a3015e80b660","Type":"ContainerStarted","Data":"fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce"} Mar 18 12:34:45 crc kubenswrapper[4921]: I0318 12:34:45.299840 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.299825015 podStartE2EDuration="2.299825015s" podCreationTimestamp="2026-03-18 12:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:45.298701214 +0000 UTC m=+1504.848621853" watchObservedRunningTime="2026-03-18 12:34:45.299825015 +0000 UTC m=+1504.849745644" Mar 18 12:34:47 crc kubenswrapper[4921]: I0318 12:34:47.081428 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:34:47 crc kubenswrapper[4921]: I0318 12:34:47.081747 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.679472 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.891184 4921 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-f4pl6"] Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.893102 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.902678 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4pl6"] Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.937074 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.937415 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.993640 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-utilities\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.993708 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g6gb\" (UniqueName: \"kubernetes.io/projected/0729ed4f-14cb-4ece-a0da-856a1d59359a-kube-api-access-7g6gb\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:48 crc kubenswrapper[4921]: I0318 12:34:48.993736 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-catalog-content\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " 
pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.095799 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-utilities\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.095920 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g6gb\" (UniqueName: \"kubernetes.io/projected/0729ed4f-14cb-4ece-a0da-856a1d59359a-kube-api-access-7g6gb\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.095955 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-catalog-content\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.096220 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-utilities\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.096715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-catalog-content\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " 
pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.112471 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g6gb\" (UniqueName: \"kubernetes.io/projected/0729ed4f-14cb-4ece-a0da-856a1d59359a-kube-api-access-7g6gb\") pod \"certified-operators-f4pl6\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.214976 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.746586 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4pl6"] Mar 18 12:34:49 crc kubenswrapper[4921]: W0318 12:34:49.756296 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0729ed4f_14cb_4ece_a0da_856a1d59359a.slice/crio-8acebaaf31a916b5ae7f0771d99c41d97acb7f7c93e37b2797e3e14195db9875 WatchSource:0}: Error finding container 8acebaaf31a916b5ae7f0771d99c41d97acb7f7c93e37b2797e3e14195db9875: Status 404 returned error can't find the container with id 8acebaaf31a916b5ae7f0771d99c41d97acb7f7c93e37b2797e3e14195db9875 Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.953288 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:49 crc kubenswrapper[4921]: I0318 12:34:49.953333 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:50 crc kubenswrapper[4921]: I0318 12:34:50.331922 4921 generic.go:334] "Generic (PLEG): container finished" podID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerID="5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33" exitCode=0 Mar 18 12:34:50 crc kubenswrapper[4921]: I0318 12:34:50.331972 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerDied","Data":"5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33"} Mar 18 12:34:50 crc kubenswrapper[4921]: I0318 12:34:50.332003 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerStarted","Data":"8acebaaf31a916b5ae7f0771d99c41d97acb7f7c93e37b2797e3e14195db9875"} Mar 18 12:34:51 crc kubenswrapper[4921]: I0318 12:34:51.342585 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerStarted","Data":"d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5"} Mar 18 12:34:51 crc kubenswrapper[4921]: I0318 12:34:51.706900 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:34:51 crc kubenswrapper[4921]: I0318 12:34:51.706945 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:34:52 crc kubenswrapper[4921]: I0318 12:34:52.355324 4921 generic.go:334] "Generic (PLEG): container finished" podID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerID="d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5" exitCode=0 Mar 18 12:34:52 crc kubenswrapper[4921]: I0318 12:34:52.355392 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerDied","Data":"d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5"} Mar 18 12:34:52 crc kubenswrapper[4921]: I0318 12:34:52.719289 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:52 crc kubenswrapper[4921]: I0318 12:34:52.719305 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:34:53 crc kubenswrapper[4921]: I0318 12:34:53.370194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerStarted","Data":"d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef"} Mar 18 12:34:53 crc kubenswrapper[4921]: I0318 12:34:53.396452 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f4pl6" podStartSLOduration=2.841236336 podStartE2EDuration="5.396422564s" podCreationTimestamp="2026-03-18 12:34:48 +0000 UTC" firstStartedPulling="2026-03-18 12:34:50.333601556 +0000 UTC m=+1509.883522195" lastFinishedPulling="2026-03-18 12:34:52.888787784 +0000 UTC m=+1512.438708423" observedRunningTime="2026-03-18 12:34:53.387083779 +0000 UTC m=+1512.937004448" watchObservedRunningTime="2026-03-18 12:34:53.396422564 +0000 UTC m=+1512.946343213" Mar 18 12:34:53 crc 
kubenswrapper[4921]: I0318 12:34:53.679557 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:34:53 crc kubenswrapper[4921]: I0318 12:34:53.708353 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:34:54 crc kubenswrapper[4921]: I0318 12:34:54.404491 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:34:56 crc kubenswrapper[4921]: I0318 12:34:56.937137 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:34:56 crc kubenswrapper[4921]: I0318 12:34:56.937475 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:34:58 crc kubenswrapper[4921]: I0318 12:34:58.494851 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:34:58 crc kubenswrapper[4921]: I0318 12:34:58.945668 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:34:58 crc kubenswrapper[4921]: I0318 12:34:58.946583 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:34:58 crc kubenswrapper[4921]: I0318 12:34:58.957186 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.220089 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.220164 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.262260 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.459315 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.504338 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.588368 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4pl6"] Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.706950 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:34:59 crc kubenswrapper[4921]: I0318 12:34:59.707034 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:35:01 crc kubenswrapper[4921]: I0318 12:35:01.256058 4921 scope.go:117] "RemoveContainer" containerID="c1aa42dc58755d43fc6bf73449b283c67dd1eb8ba8fa8c52819510041fa53028" Mar 18 12:35:01 crc kubenswrapper[4921]: I0318 12:35:01.484902 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f4pl6" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="registry-server" containerID="cri-o://d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef" gracePeriod=2 Mar 18 12:35:01 crc kubenswrapper[4921]: I0318 12:35:01.712637 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:35:01 crc kubenswrapper[4921]: I0318 12:35:01.715538 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:35:01 crc kubenswrapper[4921]: I0318 12:35:01.719837 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:35:01 crc 
kubenswrapper[4921]: I0318 12:35:01.990599 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.156368 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g6gb\" (UniqueName: \"kubernetes.io/projected/0729ed4f-14cb-4ece-a0da-856a1d59359a-kube-api-access-7g6gb\") pod \"0729ed4f-14cb-4ece-a0da-856a1d59359a\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.156460 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-catalog-content\") pod \"0729ed4f-14cb-4ece-a0da-856a1d59359a\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.156499 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-utilities\") pod \"0729ed4f-14cb-4ece-a0da-856a1d59359a\" (UID: \"0729ed4f-14cb-4ece-a0da-856a1d59359a\") " Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.157820 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-utilities" (OuterVolumeSpecName: "utilities") pod "0729ed4f-14cb-4ece-a0da-856a1d59359a" (UID: "0729ed4f-14cb-4ece-a0da-856a1d59359a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.170357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0729ed4f-14cb-4ece-a0da-856a1d59359a-kube-api-access-7g6gb" (OuterVolumeSpecName: "kube-api-access-7g6gb") pod "0729ed4f-14cb-4ece-a0da-856a1d59359a" (UID: "0729ed4f-14cb-4ece-a0da-856a1d59359a"). InnerVolumeSpecName "kube-api-access-7g6gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.213200 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0729ed4f-14cb-4ece-a0da-856a1d59359a" (UID: "0729ed4f-14cb-4ece-a0da-856a1d59359a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.258654 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g6gb\" (UniqueName: \"kubernetes.io/projected/0729ed4f-14cb-4ece-a0da-856a1d59359a-kube-api-access-7g6gb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.258701 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.258711 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0729ed4f-14cb-4ece-a0da-856a1d59359a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.498020 4921 generic.go:334] "Generic (PLEG): container finished" podID="0729ed4f-14cb-4ece-a0da-856a1d59359a" 
containerID="d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef" exitCode=0 Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.498100 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4pl6" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.498092 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerDied","Data":"d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef"} Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.499564 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4pl6" event={"ID":"0729ed4f-14cb-4ece-a0da-856a1d59359a","Type":"ContainerDied","Data":"8acebaaf31a916b5ae7f0771d99c41d97acb7f7c93e37b2797e3e14195db9875"} Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.499601 4921 scope.go:117] "RemoveContainer" containerID="d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.505090 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.533852 4921 scope.go:117] "RemoveContainer" containerID="d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.564032 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4pl6"] Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.598912 4921 scope.go:117] "RemoveContainer" containerID="5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.607945 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f4pl6"] Mar 18 12:35:02 
crc kubenswrapper[4921]: I0318 12:35:02.654749 4921 scope.go:117] "RemoveContainer" containerID="d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef" Mar 18 12:35:02 crc kubenswrapper[4921]: E0318 12:35:02.655881 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef\": container with ID starting with d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef not found: ID does not exist" containerID="d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.655974 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef"} err="failed to get container status \"d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef\": rpc error: code = NotFound desc = could not find container \"d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef\": container with ID starting with d221f4441ae8c94c01cfffdabfde5fbca44728b8b07442cf5f474c6be2b2e9ef not found: ID does not exist" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.656051 4921 scope.go:117] "RemoveContainer" containerID="d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5" Mar 18 12:35:02 crc kubenswrapper[4921]: E0318 12:35:02.657565 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5\": container with ID starting with d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5 not found: ID does not exist" containerID="d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.657624 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5"} err="failed to get container status \"d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5\": rpc error: code = NotFound desc = could not find container \"d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5\": container with ID starting with d55dc4496fbbc52c43426ca1418aeb397cdc189e552c8c5da0474bbd4a4805f5 not found: ID does not exist" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.657656 4921 scope.go:117] "RemoveContainer" containerID="5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33" Mar 18 12:35:02 crc kubenswrapper[4921]: E0318 12:35:02.657986 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33\": container with ID starting with 5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33 not found: ID does not exist" containerID="5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33" Mar 18 12:35:02 crc kubenswrapper[4921]: I0318 12:35:02.658023 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33"} err="failed to get container status \"5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33\": rpc error: code = NotFound desc = could not find container \"5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33\": container with ID starting with 5c4e150c07f43ac2e69c53bb622c06fd55d9b356f973751598d905e80bc3fd33 not found: ID does not exist" Mar 18 12:35:03 crc kubenswrapper[4921]: I0318 12:35:03.221590 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" path="/var/lib/kubelet/pods/0729ed4f-14cb-4ece-a0da-856a1d59359a/volumes" Mar 18 12:35:17 crc kubenswrapper[4921]: I0318 
12:35:17.081153 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:35:17 crc kubenswrapper[4921]: I0318 12:35:17.081716 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.491344 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.509328 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" containerName="openstackclient" containerID="cri-o://5a943b023196d78dd7c6404da7d91670ba6f169429e0a23456b547a01a996a57" gracePeriod=2 Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.516833 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.672923 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.776361 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ftdrz"] Mar 18 12:35:21 crc kubenswrapper[4921]: E0318 12:35:21.793461 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 12:35:21 crc kubenswrapper[4921]: E0318 12:35:21.793528 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data podName:df692663-cc58-4cf1-a05b-566e0152ee90 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:22.293507811 +0000 UTC m=+1541.843428450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data") pod "rabbitmq-server-0" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90") : configmap "rabbitmq-config-data" not found Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.815439 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ftdrz"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.835190 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mrt7s"] Mar 18 12:35:21 crc kubenswrapper[4921]: E0318 12:35:21.835698 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" containerName="openstackclient" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.835724 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" containerName="openstackclient" Mar 18 12:35:21 crc kubenswrapper[4921]: E0318 12:35:21.835758 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="registry-server" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.835768 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="registry-server" Mar 18 12:35:21 crc kubenswrapper[4921]: E0318 12:35:21.835783 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="extract-content" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.835792 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="extract-content" Mar 18 12:35:21 crc kubenswrapper[4921]: E0318 12:35:21.835807 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="extract-utilities" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.835817 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="extract-utilities" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.836028 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0729ed4f-14cb-4ece-a0da-856a1d59359a" containerName="registry-server" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.836057 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" containerName="openstackclient" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.836890 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.852573 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.859019 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.859476 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="openstack-network-exporter" containerID="cri-o://208313c91671ca6bab23b1fd323268c213a12034b1cf7b7e4deee0560bc9b81b" gracePeriod=300 Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.895372 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts\") pod \"root-account-create-update-mrt7s\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.895480 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpthc\" (UniqueName: \"kubernetes.io/projected/6fb2928a-2c88-4045-bc13-a7dca96f9639-kube-api-access-gpthc\") pod \"root-account-create-update-mrt7s\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.895726 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mrt7s"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.930196 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3a84-account-create-update-9plfj"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.948384 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3a84-account-create-update-9plfj"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.963582 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nszlz"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.970329 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="ovsdbserver-nb" containerID="cri-o://e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740" gracePeriod=300 Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.984897 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nszlz"] Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.997303 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gpthc\" (UniqueName: \"kubernetes.io/projected/6fb2928a-2c88-4045-bc13-a7dca96f9639-kube-api-access-gpthc\") pod \"root-account-create-update-mrt7s\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.997541 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts\") pod \"root-account-create-update-mrt7s\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:21 crc kubenswrapper[4921]: I0318 12:35:21.998441 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts\") pod \"root-account-create-update-mrt7s\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.032859 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpthc\" (UniqueName: \"kubernetes.io/projected/6fb2928a-2c88-4045-bc13-a7dca96f9639-kube-api-access-gpthc\") pod \"root-account-create-update-mrt7s\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.042030 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9fc8-account-create-update-k6glt"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.057663 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9fc8-account-create-update-k6glt"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.108448 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:35:22 crc 
kubenswrapper[4921]: I0318 12:35:22.109200 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="ovn-northd" containerID="cri-o://513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" gracePeriod=30 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.109390 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="openstack-network-exporter" containerID="cri-o://487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4" gracePeriod=30 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.149625 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.190264 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.212078 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.212200 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data podName:ef935990-b291-43b7-9d56-673b7b05a7a7 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:22.712182084 +0000 UTC m=+1542.262102723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data") pod "rabbitmq-cell1-server-0" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7") : configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.227150 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a2f6-account-create-update-229zv"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.264142 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p6tzr"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.264340 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-p6tzr" podUID="4bedf417-75a0-4163-88ee-c11ea02ae1f4" containerName="openstack-network-exporter" containerID="cri-o://f8e01516efcfac1821ca3c35e166cc122d12a03ad3ebfd843f89a0549e5ee5c2" gracePeriod=30 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.273889 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a2f6-account-create-update-229zv"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.284082 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-bg8nq"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.316213 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-djh4f"] Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.318322 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.318358 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data podName:df692663-cc58-4cf1-a05b-566e0152ee90 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:35:23.318345738 +0000 UTC m=+1542.868266377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data") pod "rabbitmq-server-0" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90") : configmap "rabbitmq-config-data" not found Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.369594 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbkfz"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.397993 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbkfz"] Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.528436 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740 is running failed: container process not found" containerID="e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.555315 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740 is running failed: container process not found" containerID="e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.563305 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-16b4-account-create-update-5nmq2"] Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.602302 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740 is running failed: container process not found" containerID="e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.602364 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="ovsdbserver-nb" Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.613599 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-16b4-account-create-update-5nmq2"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.670459 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jn2s5"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.729306 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jn2s5"] Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.774920 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:22 crc kubenswrapper[4921]: E0318 12:35:22.774980 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data podName:ef935990-b291-43b7-9d56-673b7b05a7a7 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:23.774964859 +0000 UTC m=+1543.324885498 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data") pod "rabbitmq-cell1-server-0" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7") : configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.788963 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-p4hhn"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.822483 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p6tzr_4bedf417-75a0-4163-88ee-c11ea02ae1f4/openstack-network-exporter/0.log" Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.822536 4921 generic.go:334] "Generic (PLEG): container finished" podID="4bedf417-75a0-4163-88ee-c11ea02ae1f4" containerID="f8e01516efcfac1821ca3c35e166cc122d12a03ad3ebfd843f89a0549e5ee5c2" exitCode=2 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.822608 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p6tzr" event={"ID":"4bedf417-75a0-4163-88ee-c11ea02ae1f4","Type":"ContainerDied","Data":"f8e01516efcfac1821ca3c35e166cc122d12a03ad3ebfd843f89a0549e5ee5c2"} Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.882017 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-fcnk6"] Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.889970 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf100e1-595d-4dce-9125-c27db7e9408a/ovsdbserver-nb/0.log" Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.890017 4921 generic.go:334] "Generic (PLEG): container finished" podID="abf100e1-595d-4dce-9125-c27db7e9408a" containerID="208313c91671ca6bab23b1fd323268c213a12034b1cf7b7e4deee0560bc9b81b" exitCode=2 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.890035 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="abf100e1-595d-4dce-9125-c27db7e9408a" containerID="e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740" exitCode=143 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.890084 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf100e1-595d-4dce-9125-c27db7e9408a","Type":"ContainerDied","Data":"208313c91671ca6bab23b1fd323268c213a12034b1cf7b7e4deee0560bc9b81b"} Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.890128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf100e1-595d-4dce-9125-c27db7e9408a","Type":"ContainerDied","Data":"e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740"} Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.936816 4921 generic.go:334] "Generic (PLEG): container finished" podID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerID="487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4" exitCode=2 Mar 18 12:35:22 crc kubenswrapper[4921]: I0318 12:35:22.936854 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0","Type":"ContainerDied","Data":"487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4"} Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.017509 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fcnk6"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.079626 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-p4hhn"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.128581 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-fbcg4"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.128791 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" 
containerName="dnsmasq-dns" containerID="cri-o://696963e76f2c9834064e1cc263a38d695183d851da7f00d1ff0c9af0d196e352" gracePeriod=10 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.175141 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1264-account-create-update-djvfr"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.254309 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0022dc9f-31d2-440f-831a-ae0a03c22b63" path="/var/lib/kubelet/pods/0022dc9f-31d2-440f-831a-ae0a03c22b63/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.255549 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09aa0ac1-8a57-4e30-b283-8a00711fa9df" path="/var/lib/kubelet/pods/09aa0ac1-8a57-4e30-b283-8a00711fa9df/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.260230 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17db594b-8493-4668-ab55-ed7c9f41db14" path="/var/lib/kubelet/pods/17db594b-8493-4668-ab55-ed7c9f41db14/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.260883 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d494dc6-a889-426d-964e-b168ba155763" path="/var/lib/kubelet/pods/1d494dc6-a889-426d-964e-b168ba155763/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.261507 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ebd4aa-2278-4794-a26d-a26333a7fae3" path="/var/lib/kubelet/pods/33ebd4aa-2278-4794-a26d-a26333a7fae3/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.273488 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cce4ccd-6a44-4c00-b7f5-9c74946eb308" path="/var/lib/kubelet/pods/5cce4ccd-6a44-4c00-b7f5-9c74946eb308/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.285202 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0f23d5-c8db-45a5-8c30-c746a0157440" 
path="/var/lib/kubelet/pods/8f0f23d5-c8db-45a5-8c30-c746a0157440/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.291261 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7214a61-816f-44f3-8ce2-3110d5819ad5" path="/var/lib/kubelet/pods/c7214a61-816f-44f3-8ce2-3110d5819ad5/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.292018 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7747f3a-18ba-402a-8d56-d6f7d359f1de" path="/var/lib/kubelet/pods/e7747f3a-18ba-402a-8d56-d6f7d359f1de/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.292702 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3477f12-9377-4146-a6a0-1cf5fa1c9dae" path="/var/lib/kubelet/pods/f3477f12-9377-4146-a6a0-1cf5fa1c9dae/volumes" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.293456 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1264-account-create-update-djvfr"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.293498 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-j94bd"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.293514 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ghx6h"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.293530 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-j94bd"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.293549 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ghx6h"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.299236 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.299514 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="cinder-scheduler" containerID="cri-o://b6f6eac2f1ae64a2931ebab6341ccdde8403127e4d28bf10d8ffd9bcbe4a8ea3" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.299965 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="probe" containerID="cri-o://ba3db590f738d1f28baa164d30603e96d6981b683ac84eb657fd01aa1145a053" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.318486 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.318934 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-server" containerID="cri-o://0b7785aa69c2d4d5a0513e84fe33227f3ad20c98b78d1dcca6b047589db0a914" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319613 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="swift-recon-cron" containerID="cri-o://5417c0aaf863e518433565af42abbad6a0c5b335eef0766c35d94f92e5627f39" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319673 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="rsync" containerID="cri-o://d92a64abd1a53e46dcafdafcc8d4d1c74904044d5ba50721426f103d435d57d1" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319703 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-expirer" 
containerID="cri-o://e1f5e00485fe3c35e3ec69acbb2f60126c30dad072a5acf86f531dc05351e016" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319733 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-updater" containerID="cri-o://a68d70fd22c882e995ded0c62216a18073ba6612f41913c598c593d06c61a6b2" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319766 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-auditor" containerID="cri-o://0ab56600506809bec91a2b7ae6b9bf4d001cdb5c75b88b21a9af00d3e3d40e90" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319796 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-replicator" containerID="cri-o://fbed1f40b33a5fa1094364d62d483881bd04228924310bd51d1435c0c89e479b" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319827 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-server" containerID="cri-o://6fa2fcae87945f7dd516860dd504658f8b9dc554af18972aba630feda6408da7" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319884 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-updater" containerID="cri-o://6ce294b06257e3ab13597923a81988d1346d5378dc3751bcc0f9a4ac4134d520" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319922 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-auditor" containerID="cri-o://1c245944ddb7cd5c122c6cc477fd4b8c17707a0b034cb749ccf88bd64991b476" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319952 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-replicator" containerID="cri-o://f44537122f931a3e66acf483b594422f9af64976005b3c0018487d261e996304" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.319982 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-server" containerID="cri-o://1f7ae1ab24fc1064033fd0503c3706bdd8dbdc4d41ba5cab405e7ab75a73598f" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.320010 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-reaper" containerID="cri-o://1777c4dc6cc0ebbcf08c1415f64541bce60850c8378a90e1f39c95269a83f819" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.320039 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-auditor" containerID="cri-o://89a3992a11b9a42578661ade69e99403032115ef433aaf0df1389b585d36e00b" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.320069 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-replicator" containerID="cri-o://72ba457268b54fa0d33c7866b23bca8be1894d0a484abe9be4ab2fd6c11abae3" gracePeriod=30 Mar 18 12:35:23 crc 
kubenswrapper[4921]: I0318 12:35:23.337518 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: E0318 12:35:23.351722 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 12:35:23 crc kubenswrapper[4921]: E0318 12:35:23.351787 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data podName:df692663-cc58-4cf1-a05b-566e0152ee90 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:25.351771171 +0000 UTC m=+1544.901691810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data") pod "rabbitmq-server-0" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90") : configmap "rabbitmq-config-data" not found Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.355695 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api-log" containerID="cri-o://7309c213d8e4e3cb567c026e50b6cdc87298f82a2247b0633434b2e4b5e65f3c" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.355867 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api" containerID="cri-o://d0425a7018f55d15fcb50cbcccfeff3feea433294ce78fd911a563de3145ed79" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.366826 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.367181 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" 
podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="openstack-network-exporter" containerID="cri-o://84aeae41e48cf9f3868fdc45a2a0e25ada202b804f77350b8ae8019f82805fc0" gracePeriod=300 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.402322 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7wrc2"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.419289 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7wrc2"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.480959 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.505550 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cc469779b-2mfpc"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.505939 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cc469779b-2mfpc" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-log" containerID="cri-o://ec64628734d6d0cbd273786bbce180980c769e1b8b9804c0a9567e94a61a1793" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.506606 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cc469779b-2mfpc" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-api" containerID="cri-o://21996dae0b2690b430cf03b884c33f3ef56bfe6e6623a7ddd63437c6d50e1ff5" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.532856 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.533131 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-log" 
containerID="cri-o://82561021b876e5fc67fb9e3d59a4200ce1a39a49783bf35f15eda35ba9e8e7a5" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.533249 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-metadata" containerID="cri-o://2ad13996e9b7e948e5782e01e1bad550f76a615ba18352b010ea6a536fbadc36" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.544579 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jt4qq"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.554380 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jt4qq"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.568659 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-rx99t"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.587851 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-rx99t"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.607935 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.608185 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-log" containerID="cri-o://ed60a39c31c959721c7809f4880462dd279124305bcfe7b9ac91a6dcce1bd85e" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.608472 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-httpd" containerID="cri-o://db9a0c1811b401c4f15334f71054371c07cdb06d6ce632ece0b7217159b060b3" gracePeriod=30 Mar 18 12:35:23 
crc kubenswrapper[4921]: I0318 12:35:23.623229 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4d6e-account-create-update-ksj2x"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.659668 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4d6e-account-create-update-ksj2x"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.691184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.691428 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-log" containerID="cri-o://e282050c14e51eefd65b0a5667448f8285cab09cc7f7c0ec5267fa01ddcbb423" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.691813 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-httpd" containerID="cri-o://404fbdad0acbc101609fe321e0ebd443a1518297cc849df46edbfb574ac4328a" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.713879 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.714415 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-log" containerID="cri-o://2aace202e5bc4e616801b05d9c08062b31861e303a33d0aee12e11730dc18d7e" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.714910 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-api" 
containerID="cri-o://1a2e4314370ce5c4fae41b6b33bc781b07c039542465afe1cc609a0b0992990b" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.744189 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qhdnd"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.762673 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:35:23 crc kubenswrapper[4921]: E0318 12:35:23.778293 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:23 crc kubenswrapper[4921]: E0318 12:35:23.778383 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data podName:ef935990-b291-43b7-9d56-673b7b05a7a7 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:25.77836542 +0000 UTC m=+1545.328286059 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data") pod "rabbitmq-cell1-server-0" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7") : configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.781537 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qhdnd"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.815402 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f8c57d85f-gdqrq"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.816059 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-httpd" containerID="cri-o://f7216c7a86cd828cfa2701494b171fae9e8224c04fb41a1372f7e7cb90e5cf3a" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 
12:35:23.816378 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-server" containerID="cri-o://8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5" gracePeriod=30 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.842615 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="ovsdbserver-sb" containerID="cri-o://4eb6151343cc2ec88716d0980c3d56c2d49d23288a278634cdfaa260afdf18bd" gracePeriod=300 Mar 18 12:35:23 crc kubenswrapper[4921]: E0318 12:35:23.865654 4921 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-djh4f" message="Exiting ovn-controller (1) " Mar 18 12:35:23 crc kubenswrapper[4921]: E0318 12:35:23.865717 4921 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-djh4f" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" containerID="cri-o://91ea66d0192bbffc94a048cad7e99c8827ba926ea7b1b2762795c9d6c5b354a9" Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.865764 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-djh4f" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" containerID="cri-o://91ea66d0192bbffc94a048cad7e99c8827ba926ea7b1b2762795c9d6c5b354a9" gracePeriod=29 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.877538 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ppcmx"] Mar 18 12:35:23 crc 
kubenswrapper[4921]: I0318 12:35:23.907265 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ppcmx"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.932310 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6c41-account-create-update-qsnfn"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.962012 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9n4lr"] Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.962506 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="rabbitmq" containerID="cri-o://330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537" gracePeriod=604800 Mar 18 12:35:23 crc kubenswrapper[4921]: I0318 12:35:23.975982 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kjxvt"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.006354 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aea5-account-create-update-glsg4"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.050842 4921 generic.go:334] "Generic (PLEG): container finished" podID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerID="696963e76f2c9834064e1cc263a38d695183d851da7f00d1ff0c9af0d196e352" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.050917 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" event={"ID":"66ca2e8c-3d98-4448-afa4-8a278b5f0c54","Type":"ContainerDied","Data":"696963e76f2c9834064e1cc263a38d695183d851da7f00d1ff0c9af0d196e352"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.080283 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9n4lr"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.112886 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerID="ed60a39c31c959721c7809f4880462dd279124305bcfe7b9ac91a6dcce1bd85e" exitCode=143 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.112980 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf","Type":"ContainerDied","Data":"ed60a39c31c959721c7809f4880462dd279124305bcfe7b9ac91a6dcce1bd85e"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.144445 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6c41-account-create-update-qsnfn"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.177048 4921 generic.go:334] "Generic (PLEG): container finished" podID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerID="91ea66d0192bbffc94a048cad7e99c8827ba926ea7b1b2762795c9d6c5b354a9" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.187098 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f" event={"ID":"3ee31803-52cb-4fcd-8ab1-990b0440a67a","Type":"ContainerDied","Data":"91ea66d0192bbffc94a048cad7e99c8827ba926ea7b1b2762795c9d6c5b354a9"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.202078 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kjxvt"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.228743 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-aea5-account-create-update-glsg4"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.228799 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.240176 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2rjxw"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.240891 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerID="2aace202e5bc4e616801b05d9c08062b31861e303a33d0aee12e11730dc18d7e" exitCode=143 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.241031 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3271455-7c85-4b68-a27f-fb648ae6abc9","Type":"ContainerDied","Data":"2aace202e5bc4e616801b05d9c08062b31861e303a33d0aee12e11730dc18d7e"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.254426 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86b7dc884f-42l8h"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.254647 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86b7dc884f-42l8h" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-api" containerID="cri-o://93b745602f033ec1a6fdfa900798b921c5b73da266fffb39f1ceaa11e6e673d5" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.254972 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86b7dc884f-42l8h" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-httpd" containerID="cri-o://39e6a8d2a389dad40dfed60f96860330eb4be0119c8e89e962a17c2926a11993" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.257752 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2rjxw"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.274384 4921 generic.go:334] "Generic (PLEG): container finished" podID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerID="82561021b876e5fc67fb9e3d59a4200ce1a39a49783bf35f15eda35ba9e8e7a5" exitCode=143 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.274446 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3270a214-054c-4c39-aedc-9ba6fb58a7ae","Type":"ContainerDied","Data":"82561021b876e5fc67fb9e3d59a4200ce1a39a49783bf35f15eda35ba9e8e7a5"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.281001 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-j69tq"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.282713 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="rabbitmq" containerID="cri-o://0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222" gracePeriod=604800 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.292454 4921 generic.go:334] "Generic (PLEG): container finished" podID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerID="ec64628734d6d0cbd273786bbce180980c769e1b8b9804c0a9567e94a61a1793" exitCode=143 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.292561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc469779b-2mfpc" event={"ID":"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71","Type":"ContainerDied","Data":"ec64628734d6d0cbd273786bbce180980c769e1b8b9804c0a9567e94a61a1793"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.294769 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-j69tq"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.300915 4921 generic.go:334] "Generic (PLEG): container finished" podID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" containerID="5a943b023196d78dd7c6404da7d91670ba6f169429e0a23456b547a01a996a57" exitCode=137 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.303597 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-689c8956b9-wzd7n"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.303793 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-689c8956b9-wzd7n" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker-log" containerID="cri-o://a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.304105 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-689c8956b9-wzd7n" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker" containerID="cri-o://59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.324402 4921 generic.go:334] "Generic (PLEG): container finished" podID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerID="7309c213d8e4e3cb567c026e50b6cdc87298f82a2247b0633434b2e4b5e65f3c" exitCode=143 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.324482 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b574f65d-9f59-41e3-bec6-59c25cc847fe","Type":"ContainerDied","Data":"7309c213d8e4e3cb567c026e50b6cdc87298f82a2247b0633434b2e4b5e65f3c"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.329556 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.329878 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b032b317-787f-4f39-bf12-aff187fb862f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f0d4324420151331450e6d095a9a2716913bafe1a8e2ec54e2ee054bd54bde23" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.344506 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" 
containerID="cri-o://8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" gracePeriod=28 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.369184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-c47f5f4db-j6swx"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.369444 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener-log" containerID="cri-o://d769e05c72dbc3bfb0249264660c0d34a67576be10d78c4d0f42227432990ac5" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.369798 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener" containerID="cri-o://d3111e0bf89186d6ceed4c9cbb069267d0c8685ca3cf5793ac787eddf6cf6018" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.398490 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cf79cb9db-9pf9t"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.398802 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cf79cb9db-9pf9t" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api-log" containerID="cri-o://199e0bb74b18c94897363ce6c49342390238bfcfe306768c0912bdc03eeb27b4" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.398949 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cf79cb9db-9pf9t" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api" containerID="cri-o://cfefeafefb675bac464d8262a5b628863032fde78483031396ecf5c2c726f1af" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 
12:35:24.403372 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e1887f67-2204-45c0-be34-9471594f217c/ovsdbserver-sb/0.log" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.403424 4921 generic.go:334] "Generic (PLEG): container finished" podID="e1887f67-2204-45c0-be34-9471594f217c" containerID="84aeae41e48cf9f3868fdc45a2a0e25ada202b804f77350b8ae8019f82805fc0" exitCode=2 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.403505 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1887f67-2204-45c0-be34-9471594f217c","Type":"ContainerDied","Data":"84aeae41e48cf9f3868fdc45a2a0e25ada202b804f77350b8ae8019f82805fc0"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.417802 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.418030 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="1a183a61-e314-4bd0-b332-3d216d70c6c2" containerName="nova-cell1-conductor-conductor" containerID="cri-o://3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.453324 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68h69"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.467987 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-68h69"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.477897 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.478127 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" 
containerName="nova-cell0-conductor-conductor" containerID="cri-o://4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.479558 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p6tzr_4bedf417-75a0-4163-88ee-c11ea02ae1f4/openstack-network-exporter/0.log" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.479628 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.496946 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="e1f5e00485fe3c35e3ec69acbb2f60126c30dad072a5acf86f531dc05351e016" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.496987 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="a68d70fd22c882e995ded0c62216a18073ba6612f41913c598c593d06c61a6b2" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.496997 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="0ab56600506809bec91a2b7ae6b9bf4d001cdb5c75b88b21a9af00d3e3d40e90" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497006 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="fbed1f40b33a5fa1094364d62d483881bd04228924310bd51d1435c0c89e479b" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497015 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="6ce294b06257e3ab13597923a81988d1346d5378dc3751bcc0f9a4ac4134d520" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497023 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="1c245944ddb7cd5c122c6cc477fd4b8c17707a0b034cb749ccf88bd64991b476" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497031 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="f44537122f931a3e66acf483b594422f9af64976005b3c0018487d261e996304" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497040 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="1f7ae1ab24fc1064033fd0503c3706bdd8dbdc4d41ba5cab405e7ab75a73598f" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497047 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="1777c4dc6cc0ebbcf08c1415f64541bce60850c8378a90e1f39c95269a83f819" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497055 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="89a3992a11b9a42578661ade69e99403032115ef433aaf0df1389b585d36e00b" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497063 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="72ba457268b54fa0d33c7866b23bca8be1894d0a484abe9be4ab2fd6c11abae3" exitCode=0 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497088 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"e1f5e00485fe3c35e3ec69acbb2f60126c30dad072a5acf86f531dc05351e016"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"a68d70fd22c882e995ded0c62216a18073ba6612f41913c598c593d06c61a6b2"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497160 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"0ab56600506809bec91a2b7ae6b9bf4d001cdb5c75b88b21a9af00d3e3d40e90"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497174 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"fbed1f40b33a5fa1094364d62d483881bd04228924310bd51d1435c0c89e479b"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"6ce294b06257e3ab13597923a81988d1346d5378dc3751bcc0f9a4ac4134d520"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497202 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"1c245944ddb7cd5c122c6cc477fd4b8c17707a0b034cb749ccf88bd64991b476"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497214 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"f44537122f931a3e66acf483b594422f9af64976005b3c0018487d261e996304"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497226 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"1f7ae1ab24fc1064033fd0503c3706bdd8dbdc4d41ba5cab405e7ab75a73598f"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 
12:35:24.497238 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"1777c4dc6cc0ebbcf08c1415f64541bce60850c8378a90e1f39c95269a83f819"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497248 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"89a3992a11b9a42578661ade69e99403032115ef433aaf0df1389b585d36e00b"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.497260 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"72ba457268b54fa0d33c7866b23bca8be1894d0a484abe9be4ab2fd6c11abae3"} Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.498145 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf100e1-595d-4dce-9125-c27db7e9408a/ovsdbserver-nb/0.log" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.498227 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.516056 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfzsh"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.520229 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.523315 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfzsh"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.552480 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerName="galera" containerID="cri-o://40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277" gracePeriod=29 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.558423 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mrt7s"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.576211 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.576478 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f3fdc858-ca78-4137-b287-a3015e80b660" containerName="nova-scheduler-scheduler" containerID="cri-o://fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce" gracePeriod=30 Mar 18 12:35:24 crc kubenswrapper[4921]: E0318 12:35:24.577517 4921 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 18 12:35:24 crc kubenswrapper[4921]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 12:35:24 crc kubenswrapper[4921]: + source /usr/local/bin/container-scripts/functions Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNBridge=br-int Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNRemote=tcp:localhost:6642 Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNEncapType=geneve Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNAvailabilityZones= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ 
EnableChassisAsGateway=true Mar 18 12:35:24 crc kubenswrapper[4921]: ++ PhysicalNetworks= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNHostName= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 12:35:24 crc kubenswrapper[4921]: ++ ovs_dir=/var/lib/openvswitch Mar 18 12:35:24 crc kubenswrapper[4921]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 12:35:24 crc kubenswrapper[4921]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 12:35:24 crc kubenswrapper[4921]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + cleanup_ovsdb_server_semaphore Mar 18 12:35:24 crc kubenswrapper[4921]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 12:35:24 crc kubenswrapper[4921]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 12:35:24 crc kubenswrapper[4921]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-bg8nq" message=< Mar 18 12:35:24 crc kubenswrapper[4921]: Exiting ovsdb-server (5) [ OK ] Mar 18 12:35:24 crc kubenswrapper[4921]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 12:35:24 crc kubenswrapper[4921]: + source /usr/local/bin/container-scripts/functions Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNBridge=br-int Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNRemote=tcp:localhost:6642 Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNEncapType=geneve Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNAvailabilityZones= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ EnableChassisAsGateway=true Mar 18 12:35:24 crc kubenswrapper[4921]: ++ PhysicalNetworks= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNHostName= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 12:35:24 crc kubenswrapper[4921]: ++ ovs_dir=/var/lib/openvswitch Mar 18 12:35:24 crc kubenswrapper[4921]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 12:35:24 crc kubenswrapper[4921]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 12:35:24 crc kubenswrapper[4921]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + cleanup_ovsdb_server_semaphore Mar 18 12:35:24 crc kubenswrapper[4921]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 12:35:24 crc kubenswrapper[4921]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 12:35:24 crc kubenswrapper[4921]: > Mar 18 12:35:24 crc kubenswrapper[4921]: E0318 12:35:24.577557 4921 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 18 12:35:24 crc kubenswrapper[4921]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 12:35:24 crc kubenswrapper[4921]: + source /usr/local/bin/container-scripts/functions Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNBridge=br-int Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNRemote=tcp:localhost:6642 Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNEncapType=geneve Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNAvailabilityZones= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ EnableChassisAsGateway=true Mar 18 12:35:24 crc kubenswrapper[4921]: ++ PhysicalNetworks= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ OVNHostName= Mar 18 12:35:24 crc kubenswrapper[4921]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 12:35:24 crc kubenswrapper[4921]: ++ ovs_dir=/var/lib/openvswitch Mar 18 12:35:24 crc kubenswrapper[4921]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 12:35:24 crc kubenswrapper[4921]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 12:35:24 crc kubenswrapper[4921]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + sleep 0.5 Mar 18 12:35:24 crc kubenswrapper[4921]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 12:35:24 crc kubenswrapper[4921]: + cleanup_ovsdb_server_semaphore Mar 18 12:35:24 crc kubenswrapper[4921]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 12:35:24 crc kubenswrapper[4921]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 12:35:24 crc kubenswrapper[4921]: > pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" containerID="cri-o://78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.577594 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" containerID="cri-o://78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" gracePeriod=28 Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596760 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovn-rundir\") pod \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596832 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bedf417-75a0-4163-88ee-c11ea02ae1f4-config\") pod \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596863 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jstr2\" (UniqueName: \"kubernetes.io/projected/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-kube-api-access-jstr2\") pod \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596922 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-sb\") pod \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596949 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdbserver-nb-tls-certs\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596973 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-nb\") pod \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.596990 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovs-rundir\") pod \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\" (UID: 
\"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597011 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdb-rundir\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597025 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597063 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-metrics-certs-tls-certs\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597094 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-config\") pod \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597142 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njxbh\" (UniqueName: \"kubernetes.io/projected/4bedf417-75a0-4163-88ee-c11ea02ae1f4-kube-api-access-njxbh\") pod \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597171 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-swift-storage-0\") pod \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-combined-ca-bundle\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597251 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-metrics-certs-tls-certs\") pod \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-config\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597353 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-svc\") pod \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\" (UID: \"66ca2e8c-3d98-4448-afa4-8a278b5f0c54\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597381 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-combined-ca-bundle\") pod \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\" (UID: \"4bedf417-75a0-4163-88ee-c11ea02ae1f4\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 
12:35:24.597421 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f884h\" (UniqueName: \"kubernetes.io/projected/abf100e1-595d-4dce-9125-c27db7e9408a-kube-api-access-f884h\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.597467 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-scripts\") pod \"abf100e1-595d-4dce-9125-c27db7e9408a\" (UID: \"abf100e1-595d-4dce-9125-c27db7e9408a\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.607470 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-config" (OuterVolumeSpecName: "config") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.611739 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "4bedf417-75a0-4163-88ee-c11ea02ae1f4" (UID: "4bedf417-75a0-4163-88ee-c11ea02ae1f4"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.611827 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "4bedf417-75a0-4163-88ee-c11ea02ae1f4" (UID: "4bedf417-75a0-4163-88ee-c11ea02ae1f4"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.613877 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.621577 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-scripts" (OuterVolumeSpecName: "scripts") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.625441 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.634284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bedf417-75a0-4163-88ee-c11ea02ae1f4-config" (OuterVolumeSpecName: "config") pod "4bedf417-75a0-4163-88ee-c11ea02ae1f4" (UID: "4bedf417-75a0-4163-88ee-c11ea02ae1f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.669359 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bedf417-75a0-4163-88ee-c11ea02ae1f4-kube-api-access-njxbh" (OuterVolumeSpecName: "kube-api-access-njxbh") pod "4bedf417-75a0-4163-88ee-c11ea02ae1f4" (UID: "4bedf417-75a0-4163-88ee-c11ea02ae1f4"). InnerVolumeSpecName "kube-api-access-njxbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.675580 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-kube-api-access-jstr2" (OuterVolumeSpecName: "kube-api-access-jstr2") pod "66ca2e8c-3d98-4448-afa4-8a278b5f0c54" (UID: "66ca2e8c-3d98-4448-afa4-8a278b5f0c54"). InnerVolumeSpecName "kube-api-access-jstr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.694027 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf100e1-595d-4dce-9125-c27db7e9408a-kube-api-access-f884h" (OuterVolumeSpecName: "kube-api-access-f884h") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "kube-api-access-f884h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.717711 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729509 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729539 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f884h\" (UniqueName: \"kubernetes.io/projected/abf100e1-595d-4dce-9125-c27db7e9408a-kube-api-access-f884h\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729549 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf100e1-595d-4dce-9125-c27db7e9408a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729557 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729611 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bedf417-75a0-4163-88ee-c11ea02ae1f4-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729621 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstr2\" (UniqueName: \"kubernetes.io/projected/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-kube-api-access-jstr2\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729631 4921 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4bedf417-75a0-4163-88ee-c11ea02ae1f4-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729662 4921 reconciler_common.go:293] 
"Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729686 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.729696 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njxbh\" (UniqueName: \"kubernetes.io/projected/4bedf417-75a0-4163-88ee-c11ea02ae1f4-kube-api-access-njxbh\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.830469 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config\") pod \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.830624 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config-secret\") pod \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.830707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-combined-ca-bundle\") pod \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.830734 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlpql\" (UniqueName: 
\"kubernetes.io/projected/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-kube-api-access-qlpql\") pod \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\" (UID: \"bc025b71-4b10-41e1-bccf-0d67a9b36b0f\") " Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.869356 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-kube-api-access-qlpql" (OuterVolumeSpecName: "kube-api-access-qlpql") pod "bc025b71-4b10-41e1-bccf-0d67a9b36b0f" (UID: "bc025b71-4b10-41e1-bccf-0d67a9b36b0f"). InnerVolumeSpecName "kube-api-access-qlpql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:24 crc kubenswrapper[4921]: E0318 12:35:24.896618 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod870b5852_0790_4d4a_a0c1_df7789287b36.slice/crio-8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2204df50_7907_4d3b_a8b3_5aee222044f2.slice/crio-conmon-0b7785aa69c2d4d5a0513e84fe33227f3ad20c98b78d1dcca6b047589db0a914.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod870b5852_0790_4d4a_a0c1_df7789287b36.slice/crio-conmon-8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf1cfbea_21b1_4f20_95c2_22c07304789c.slice/crio-conmon-ba3db590f738d1f28baa164d30603e96d6981b683ac84eb657fd01aa1145a053.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76380191_f4a9_4690_bb6e_cb85ad794e33.slice/crio-78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76380191_f4a9_4690_bb6e_cb85ad794e33.slice/crio-conmon-78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.933201 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlpql\" (UniqueName: \"kubernetes.io/projected/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-kube-api-access-qlpql\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:24 crc kubenswrapper[4921]: I0318 12:35:24.958436 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.027659 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.041837 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.109720 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.145992 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66ca2e8c-3d98-4448-afa4-8a278b5f0c54" (UID: "66ca2e8c-3d98-4448-afa4-8a278b5f0c54"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154306 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2kmj\" (UniqueName: \"kubernetes.io/projected/3ee31803-52cb-4fcd-8ab1-990b0440a67a-kube-api-access-n2kmj\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154448 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run-ovn\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154524 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-ovn-controller-tls-certs\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154659 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-log-ovn\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154698 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-combined-ca-bundle\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154732 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.154750 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ee31803-52cb-4fcd-8ab1-990b0440a67a-scripts\") pod \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\" (UID: \"3ee31803-52cb-4fcd-8ab1-990b0440a67a\") " Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.155100 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.155132 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.166816 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ee31803-52cb-4fcd-8ab1-990b0440a67a-scripts" (OuterVolumeSpecName: "scripts") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.167649 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.170645 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run" (OuterVolumeSpecName: "var-run") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.170706 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.172937 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc025b71-4b10-41e1-bccf-0d67a9b36b0f" (UID: "bc025b71-4b10-41e1-bccf-0d67a9b36b0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.204223 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee31803-52cb-4fcd-8ab1-990b0440a67a-kube-api-access-n2kmj" (OuterVolumeSpecName: "kube-api-access-n2kmj") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). InnerVolumeSpecName "kube-api-access-n2kmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.207772 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bc025b71-4b10-41e1-bccf-0d67a9b36b0f" (UID: "bc025b71-4b10-41e1-bccf-0d67a9b36b0f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.271175 4921 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.271207 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.271220 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ee31803-52cb-4fcd-8ab1-990b0440a67a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.273232 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2kmj\" (UniqueName: \"kubernetes.io/projected/3ee31803-52cb-4fcd-8ab1-990b0440a67a-kube-api-access-n2kmj\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.273261 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.273307 4921 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ee31803-52cb-4fcd-8ab1-990b0440a67a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.273322 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.281381 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19620ae3-0817-4f27-a363-c57b6b7a0a99" path="/var/lib/kubelet/pods/19620ae3-0817-4f27-a363-c57b6b7a0a99/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.281937 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2164592b-7066-42f2-a9f8-f87f2b3eb19e" path="/var/lib/kubelet/pods/2164592b-7066-42f2-a9f8-f87f2b3eb19e/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.282442 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8" path="/var/lib/kubelet/pods/2fc6bf1d-adcc-4c68-97af-49e6a7cbb0e8/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.282896 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405b0dbf-0f32-499a-a71f-9ed82f5eb6a9" path="/var/lib/kubelet/pods/405b0dbf-0f32-499a-a71f-9ed82f5eb6a9/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.284235 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412d3034-b9e8-403d-a293-6c7ed02a7751" 
path="/var/lib/kubelet/pods/412d3034-b9e8-403d-a293-6c7ed02a7751/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.284784 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f3503b-9ffc-4667-b9f5-9a4880895948" path="/var/lib/kubelet/pods/85f3503b-9ffc-4667-b9f5-9a4880895948/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.285676 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865b5e1d-7747-4cc6-b9fb-e65784799085" path="/var/lib/kubelet/pods/865b5e1d-7747-4cc6-b9fb-e65784799085/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.286796 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f16f79-7171-482d-b31a-0a204980fbf6" path="/var/lib/kubelet/pods/97f16f79-7171-482d-b31a-0a204980fbf6/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.287330 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6306e6-563d-4511-ba58-e685c1c9a599" path="/var/lib/kubelet/pods/9b6306e6-563d-4511-ba58-e685c1c9a599/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.293475 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb" path="/var/lib/kubelet/pods/9c2b474d-ae52-4dc6-aecd-2a38dc6b26eb/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.294357 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d8c16c-799a-4946-80d8-5db5566e76c9" path="/var/lib/kubelet/pods/b6d8c16c-799a-4946-80d8-5db5566e76c9/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.295258 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7908371-b4b8-4437-be4a-13b8fccb6a9f" path="/var/lib/kubelet/pods/b7908371-b4b8-4437-be4a-13b8fccb6a9f/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.296767 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00f5d00-73a8-4268-acc9-49f809cf6d7f" 
path="/var/lib/kubelet/pods/c00f5d00-73a8-4268-acc9-49f809cf6d7f/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.301382 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9" path="/var/lib/kubelet/pods/c4c7bfce-1fe1-4edc-af60-a7b5c36e5ab9/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.302665 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a6de4f-07c3-4217-956b-22b24f366220" path="/var/lib/kubelet/pods/c7a6de4f-07c3-4217-956b-22b24f366220/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.303663 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6989392-6285-4c0d-80e4-3d2d30461e4f" path="/var/lib/kubelet/pods/f6989392-6285-4c0d-80e4-3d2d30461e4f/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.311802 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff14077c-4ea8-4a45-925c-0f83af75745c" path="/var/lib/kubelet/pods/ff14077c-4ea8-4a45-925c-0f83af75745c/volumes" Mar 18 12:35:25 crc kubenswrapper[4921]: E0318 12:35:25.375594 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 12:35:25 crc kubenswrapper[4921]: E0318 12:35:25.375669 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data podName:df692663-cc58-4cf1-a05b-566e0152ee90 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:29.375651238 +0000 UTC m=+1548.925571877 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data") pod "rabbitmq-server-0" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90") : configmap "rabbitmq-config-data" not found Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.405220 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bedf417-75a0-4163-88ee-c11ea02ae1f4" (UID: "4bedf417-75a0-4163-88ee-c11ea02ae1f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.417383 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66ca2e8c-3d98-4448-afa4-8a278b5f0c54" (UID: "66ca2e8c-3d98-4448-afa4-8a278b5f0c54"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.426918 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "66ca2e8c-3d98-4448-afa4-8a278b5f0c54" (UID: "66ca2e8c-3d98-4448-afa4-8a278b5f0c54"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.445932 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.471343 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.479252 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.482209 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.482341 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.482445 4921 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.482515 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.517035 
4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p6tzr_4bedf417-75a0-4163-88ee-c11ea02ae1f4/openstack-network-exporter/0.log" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.517284 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p6tzr" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.520680 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66ca2e8c-3d98-4448-afa4-8a278b5f0c54" (UID: "66ca2e8c-3d98-4448-afa4-8a278b5f0c54"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.535972 4921 generic.go:334] "Generic (PLEG): container finished" podID="08667791-7c42-46d1-a74b-436dfefa5db3" containerID="199e0bb74b18c94897363ce6c49342390238bfcfe306768c0912bdc03eeb27b4" exitCode=143 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.559415 4921 generic.go:334] "Generic (PLEG): container finished" podID="870b5852-0790-4d4a-a0c1-df7789287b36" containerID="8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.559444 4921 generic.go:334] "Generic (PLEG): container finished" podID="870b5852-0790-4d4a-a0c1-df7789287b36" containerID="f7216c7a86cd828cfa2701494b171fae9e8224c04fb41a1372f7e7cb90e5cf3a" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.563616 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bc025b71-4b10-41e1-bccf-0d67a9b36b0f" (UID: "bc025b71-4b10-41e1-bccf-0d67a9b36b0f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.569254 4921 generic.go:334] "Generic (PLEG): container finished" podID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerID="e282050c14e51eefd65b0a5667448f8285cab09cc7f7c0ec5267fa01ddcbb423" exitCode=143 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.584614 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.584640 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc025b71-4b10-41e1-bccf-0d67a9b36b0f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.587318 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf100e1-595d-4dce-9125-c27db7e9408a/ovsdbserver-nb/0.log" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.587579 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.601055 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "3ee31803-52cb-4fcd-8ab1-990b0440a67a" (UID: "3ee31803-52cb-4fcd-8ab1-990b0440a67a"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.607390 4921 generic.go:334] "Generic (PLEG): container finished" podID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.610100 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-mrt7s" podStartSLOduration=4.6100679719999995 podStartE2EDuration="4.610067972s" podCreationTimestamp="2026-03-18 12:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:25.601979603 +0000 UTC m=+1545.151900252" watchObservedRunningTime="2026-03-18 12:35:25.610067972 +0000 UTC m=+1545.159988611" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.621275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4bedf417-75a0-4163-88ee-c11ea02ae1f4" (UID: "4bedf417-75a0-4163-88ee-c11ea02ae1f4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.625612 4921 generic.go:334] "Generic (PLEG): container finished" podID="18566d04-485b-411a-a1b8-e761a7fa6933" containerID="d769e05c72dbc3bfb0249264660c0d34a67576be10d78c4d0f42227432990ac5" exitCode=143 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.628200 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-config" (OuterVolumeSpecName: "config") pod "66ca2e8c-3d98-4448-afa4-8a278b5f0c54" (UID: "66ca2e8c-3d98-4448-afa4-8a278b5f0c54"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.630336 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerID="ba3db590f738d1f28baa164d30603e96d6981b683ac84eb657fd01aa1145a053" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.630368 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerID="b6f6eac2f1ae64a2931ebab6341ccdde8403127e4d28bf10d8ffd9bcbe4a8ea3" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.632101 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.637300 4921 generic.go:334] "Generic (PLEG): container finished" podID="b032b317-787f-4f39-bf12-aff187fb862f" containerID="f0d4324420151331450e6d095a9a2716913bafe1a8e2ec54e2ee054bd54bde23" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.643544 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e1887f67-2204-45c0-be34-9471594f217c/ovsdbserver-sb/0.log" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.643621 4921 generic.go:334] "Generic (PLEG): container finished" podID="e1887f67-2204-45c0-be34-9471594f217c" containerID="4eb6151343cc2ec88716d0980c3d56c2d49d23288a278634cdfaa260afdf18bd" exitCode=143 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.646676 4921 generic.go:334] "Generic (PLEG): container finished" podID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerID="a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9" exitCode=143 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.649470 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.680670 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "abf100e1-595d-4dce-9125-c27db7e9408a" (UID: "abf100e1-595d-4dce-9125-c27db7e9408a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.690768 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf100e1-595d-4dce-9125-c27db7e9408a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.690800 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ca2e8c-3d98-4448-afa4-8a278b5f0c54-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.690813 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee31803-52cb-4fcd-8ab1-990b0440a67a-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.690824 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bedf417-75a0-4163-88ee-c11ea02ae1f4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.696485 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="d92a64abd1a53e46dcafdafcc8d4d1c74904044d5ba50721426f103d435d57d1" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.696527 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="6fa2fcae87945f7dd516860dd504658f8b9dc554af18972aba630feda6408da7" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.697663 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="0b7785aa69c2d4d5a0513e84fe33227f3ad20c98b78d1dcca6b047589db0a914" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.712835 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-djh4f" Mar 18 12:35:25 crc kubenswrapper[4921]: E0318 12:35:25.792624 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:25 crc kubenswrapper[4921]: E0318 12:35:25.792697 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data podName:ef935990-b291-43b7-9d56-673b7b05a7a7 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:29.792683695 +0000 UTC m=+1549.342604334 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data") pod "rabbitmq-cell1-server-0" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7") : configmap "rabbitmq-cell1-config-data" not found
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893245 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p6tzr" event={"ID":"4bedf417-75a0-4163-88ee-c11ea02ae1f4","Type":"ContainerDied","Data":"b132a8633b6483dfdf7634ec61b2e3ac778375792c88eaea256d55b8ffed8e4e"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893297 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf79cb9db-9pf9t" event={"ID":"08667791-7c42-46d1-a74b-436dfefa5db3","Type":"ContainerDied","Data":"199e0bb74b18c94897363ce6c49342390238bfcfe306768c0912bdc03eeb27b4"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" event={"ID":"870b5852-0790-4d4a-a0c1-df7789287b36","Type":"ContainerDied","Data":"8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" event={"ID":"870b5852-0790-4d4a-a0c1-df7789287b36","Type":"ContainerDied","Data":"f7216c7a86cd828cfa2701494b171fae9e8224c04fb41a1372f7e7cb90e5cf3a"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893353 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" event={"ID":"870b5852-0790-4d4a-a0c1-df7789287b36","Type":"ContainerDied","Data":"99e18b3ef98f53aa6c585067bbad4419a8c051b09b8e3f0ba8c5bd0ad16cdcf5"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893366 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e18b3ef98f53aa6c585067bbad4419a8c051b09b8e3f0ba8c5bd0ad16cdcf5"
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893385 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"778f8baf-82ce-457d-b32d-35d3abe1a79d","Type":"ContainerDied","Data":"e282050c14e51eefd65b0a5667448f8285cab09cc7f7c0ec5267fa01ddcbb423"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mrt7s" event={"ID":"6fb2928a-2c88-4045-bc13-a7dca96f9639","Type":"ContainerStarted","Data":"9baac04c9c1b71a48e09955446eb2be6ec58220298201762700babc62bbb9036"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893411 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mrt7s" event={"ID":"6fb2928a-2c88-4045-bc13-a7dca96f9639","Type":"ContainerStarted","Data":"520eb97ba82a0bd746169ea808839074daf3537c9c81e1fd9f67c30656b6f82f"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf100e1-595d-4dce-9125-c27db7e9408a","Type":"ContainerDied","Data":"fb29aa2ecd303f3ad9fa4bb4e91b68eff97431f3aadc4531a48c4ee4624b1a8d"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893519 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerDied","Data":"78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893532 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" event={"ID":"18566d04-485b-411a-a1b8-e761a7fa6933","Type":"ContainerDied","Data":"d769e05c72dbc3bfb0249264660c0d34a67576be10d78c4d0f42227432990ac5"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bf1cfbea-21b1-4f20-95c2-22c07304789c","Type":"ContainerDied","Data":"ba3db590f738d1f28baa164d30603e96d6981b683ac84eb657fd01aa1145a053"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bf1cfbea-21b1-4f20-95c2-22c07304789c","Type":"ContainerDied","Data":"b6f6eac2f1ae64a2931ebab6341ccdde8403127e4d28bf10d8ffd9bcbe4a8ea3"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893581 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b032b317-787f-4f39-bf12-aff187fb862f","Type":"ContainerDied","Data":"f0d4324420151331450e6d095a9a2716913bafe1a8e2ec54e2ee054bd54bde23"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893600 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1887f67-2204-45c0-be34-9471594f217c","Type":"ContainerDied","Data":"4eb6151343cc2ec88716d0980c3d56c2d49d23288a278634cdfaa260afdf18bd"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893616 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e1887f67-2204-45c0-be34-9471594f217c","Type":"ContainerDied","Data":"0b397ad591533ebe1ec56706eddfe15e2df40f430c1302166d0264f1e64f0548"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893626 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b397ad591533ebe1ec56706eddfe15e2df40f430c1302166d0264f1e64f0548"
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893637 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-689c8956b9-wzd7n" event={"ID":"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a","Type":"ContainerDied","Data":"a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893650 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-fbcg4" event={"ID":"66ca2e8c-3d98-4448-afa4-8a278b5f0c54","Type":"ContainerDied","Data":"1ed230a204f4e56c3255eae9face8bf1b5eab134a84132b30b3f8b192561fd0a"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"d92a64abd1a53e46dcafdafcc8d4d1c74904044d5ba50721426f103d435d57d1"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893681 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"6fa2fcae87945f7dd516860dd504658f8b9dc554af18972aba630feda6408da7"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893691 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"0b7785aa69c2d4d5a0513e84fe33227f3ad20c98b78d1dcca6b047589db0a914"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.893703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-djh4f" event={"ID":"3ee31803-52cb-4fcd-8ab1-990b0440a67a","Type":"ContainerDied","Data":"9bf619b0c98e4a96d465d46d66874a7f13971b32f242ce3c41cbaf1d94033161"}
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.898234 4921 scope.go:117] "RemoveContainer" containerID="f8e01516efcfac1821ca3c35e166cc122d12a03ad3ebfd843f89a0549e5ee5c2"
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.915520 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e1887f67-2204-45c0-be34-9471594f217c/ovsdbserver-sb/0.log"
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.915598 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.996860 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-metrics-certs-tls-certs\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.996913 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-ovsdbserver-sb-tls-certs\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.996986 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1887f67-2204-45c0-be34-9471594f217c-ovsdb-rundir\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.997518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1887f67-2204-45c0-be34-9471594f217c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.997607 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldnl6\" (UniqueName: \"kubernetes.io/projected/e1887f67-2204-45c0-be34-9471594f217c-kube-api-access-ldnl6\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.997633 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-scripts\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.997984 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.998247 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-config\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.998292 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-combined-ca-bundle\") pod \"e1887f67-2204-45c0-be34-9471594f217c\" (UID: \"e1887f67-2204-45c0-be34-9471594f217c\") "
Mar 18 12:35:25 crc kubenswrapper[4921]: I0318 12:35:25.998676 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1887f67-2204-45c0-be34-9471594f217c-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.000208 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-scripts" (OuterVolumeSpecName: "scripts") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.001034 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-config" (OuterVolumeSpecName: "config") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.054762 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1887f67-2204-45c0-be34-9471594f217c-kube-api-access-ldnl6" (OuterVolumeSpecName: "kube-api-access-ldnl6") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "kube-api-access-ldnl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.056811 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f8c57d85f-gdqrq"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.059970 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.066527 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.074360 4921 scope.go:117] "RemoveContainer" containerID="208313c91671ca6bab23b1fd323268c213a12034b1cf7b7e4deee0560bc9b81b"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.095450 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101478 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data-custom\") pod \"bf1cfbea-21b1-4f20-95c2-22c07304789c\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101577 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/bf1cfbea-21b1-4f20-95c2-22c07304789c-kube-api-access-zgnmt\") pod \"bf1cfbea-21b1-4f20-95c2-22c07304789c\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101604 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-scripts\") pod \"bf1cfbea-21b1-4f20-95c2-22c07304789c\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101623 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-config-data\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101644 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-internal-tls-certs\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101683 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xpf6\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-kube-api-access-5xpf6\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101705 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-log-httpd\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101730 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-combined-ca-bundle\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101750 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf1cfbea-21b1-4f20-95c2-22c07304789c-etc-machine-id\") pod \"bf1cfbea-21b1-4f20-95c2-22c07304789c\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101771 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-etc-swift\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101795 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-public-tls-certs\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101813 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data\") pod \"bf1cfbea-21b1-4f20-95c2-22c07304789c\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101846 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-combined-ca-bundle\") pod \"bf1cfbea-21b1-4f20-95c2-22c07304789c\" (UID: \"bf1cfbea-21b1-4f20-95c2-22c07304789c\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.101863 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-run-httpd\") pod \"870b5852-0790-4d4a-a0c1-df7789287b36\" (UID: \"870b5852-0790-4d4a-a0c1-df7789287b36\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.102147 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldnl6\" (UniqueName: \"kubernetes.io/projected/e1887f67-2204-45c0-be34-9471594f217c-kube-api-access-ldnl6\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.102165 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.102188 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.102199 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1887f67-2204-45c0-be34-9471594f217c-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.102209 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.104206 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.119621 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf1cfbea-21b1-4f20-95c2-22c07304789c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bf1cfbea-21b1-4f20-95c2-22c07304789c" (UID: "bf1cfbea-21b1-4f20-95c2-22c07304789c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.120792 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p6tzr"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.130618 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bf1cfbea-21b1-4f20-95c2-22c07304789c" (UID: "bf1cfbea-21b1-4f20-95c2-22c07304789c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.135779 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.140296 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-kube-api-access-5xpf6" (OuterVolumeSpecName: "kube-api-access-5xpf6") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "kube-api-access-5xpf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.148611 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-p6tzr"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.155964 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.170007 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1cfbea-21b1-4f20-95c2-22c07304789c-kube-api-access-zgnmt" (OuterVolumeSpecName: "kube-api-access-zgnmt") pod "bf1cfbea-21b1-4f20-95c2-22c07304789c" (UID: "bf1cfbea-21b1-4f20-95c2-22c07304789c"). InnerVolumeSpecName "kube-api-access-zgnmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.171301 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-scripts" (OuterVolumeSpecName: "scripts") pod "bf1cfbea-21b1-4f20-95c2-22c07304789c" (UID: "bf1cfbea-21b1-4f20-95c2-22c07304789c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.180460 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-fbcg4"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.188322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203306 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf1cfbea-21b1-4f20-95c2-22c07304789c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203329 4921 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203339 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203348 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203355 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203363 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgnmt\" (UniqueName: \"kubernetes.io/projected/bf1cfbea-21b1-4f20-95c2-22c07304789c-kube-api-access-zgnmt\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203370 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203378 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xpf6\" (UniqueName: \"kubernetes.io/projected/870b5852-0790-4d4a-a0c1-df7789287b36-kube-api-access-5xpf6\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.203386 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/870b5852-0790-4d4a-a0c1-df7789287b36-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.211275 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-fbcg4"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.216357 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.273261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.306614 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.306648 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.307034 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-djh4f"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.317072 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-djh4f"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.355286 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-config-data" (OuterVolumeSpecName: "config-data") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.358347 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.372094 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.410603 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.410695 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf1cfbea-21b1-4f20-95c2-22c07304789c" (UID: "bf1cfbea-21b1-4f20-95c2-22c07304789c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.412082 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.412133 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.412147 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.478139 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.486344 4921 scope.go:117] "RemoveContainer" containerID="e3189baa795f31117342258600e4a54fc7b7135f65ae4d258dd6f3c6c51a0740"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.489087 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.494385 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e1887f67-2204-45c0-be34-9471594f217c" (UID: "e1887f67-2204-45c0-be34-9471594f217c"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.510338 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "870b5852-0790-4d4a-a0c1-df7789287b36" (UID: "870b5852-0790-4d4a-a0c1-df7789287b36"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.518012 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/870b5852-0790-4d4a-a0c1-df7789287b36-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.518076 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1887f67-2204-45c0-be34-9471594f217c-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.521191 4921 scope.go:117] "RemoveContainer" containerID="5a943b023196d78dd7c6404da7d91670ba6f169429e0a23456b547a01a996a57"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.549831 4921 scope.go:117] "RemoveContainer" containerID="696963e76f2c9834064e1cc263a38d695183d851da7f00d1ff0c9af0d196e352"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.555953 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data" (OuterVolumeSpecName: "config-data") pod "bf1cfbea-21b1-4f20-95c2-22c07304789c" (UID: "bf1cfbea-21b1-4f20-95c2-22c07304789c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.568840 4921 scope.go:117] "RemoveContainer" containerID="da6d212c129e736460b40210e84be01f919a959e7de91a56b27199b3820ed71d"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.591358 4921 scope.go:117] "RemoveContainer" containerID="91ea66d0192bbffc94a048cad7e99c8827ba926ea7b1b2762795c9d6c5b354a9"
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619336 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-operator-scripts\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619504 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-vencrypt-tls-certs\") pod \"b032b317-787f-4f39-bf12-aff187fb862f\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619566 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-galera-tls-certs\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619626 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbm2b\" (UniqueName: \"kubernetes.io/projected/b032b317-787f-4f39-bf12-aff187fb862f-kube-api-access-cbm2b\") pod \"b032b317-787f-4f39-bf12-aff187fb862f\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619672 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5lgr\" (UniqueName: \"kubernetes.io/projected/f3456852-8fb7-4e40-81d1-f3ba06088f81-kube-api-access-c5lgr\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-kolla-config\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619789 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619879 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-combined-ca-bundle\") pod \"b032b317-787f-4f39-bf12-aff187fb862f\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619920 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-config-data\") pod \"b032b317-787f-4f39-bf12-aff187fb862f\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.619970 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-combined-ca-bundle\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.620074 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-nova-novncproxy-tls-certs\") pod \"b032b317-787f-4f39-bf12-aff187fb862f\" (UID: \"b032b317-787f-4f39-bf12-aff187fb862f\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.620099 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-generated\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") "
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.622433 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.622715 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "config-data-generated".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.624202 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-default\") pod \"f3456852-8fb7-4e40-81d1-f3ba06088f81\" (UID: \"f3456852-8fb7-4e40-81d1-f3ba06088f81\") " Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.625320 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.625439 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.625452 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.625461 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1cfbea-21b1-4f20-95c2-22c07304789c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.626460 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: 
"f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.627607 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3456852-8fb7-4e40-81d1-f3ba06088f81-kube-api-access-c5lgr" (OuterVolumeSpecName: "kube-api-access-c5lgr") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "kube-api-access-c5lgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.632604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b032b317-787f-4f39-bf12-aff187fb862f-kube-api-access-cbm2b" (OuterVolumeSpecName: "kube-api-access-cbm2b") pod "b032b317-787f-4f39-bf12-aff187fb862f" (UID: "b032b317-787f-4f39-bf12-aff187fb862f"). InnerVolumeSpecName "kube-api-access-cbm2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.643285 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.668707 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-config-data" (OuterVolumeSpecName: "config-data") pod "b032b317-787f-4f39-bf12-aff187fb862f" (UID: "b032b317-787f-4f39-bf12-aff187fb862f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.680207 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b032b317-787f-4f39-bf12-aff187fb862f" (UID: "b032b317-787f-4f39-bf12-aff187fb862f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.687250 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "b032b317-787f-4f39-bf12-aff187fb862f" (UID: "b032b317-787f-4f39-bf12-aff187fb862f"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.690354 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "b032b317-787f-4f39-bf12-aff187fb862f" (UID: "b032b317-787f-4f39-bf12-aff187fb862f"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.697387 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.707928 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3456852-8fb7-4e40-81d1-f3ba06088f81" (UID: "f3456852-8fb7-4e40-81d1-f3ba06088f81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726543 4921 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726570 4921 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726579 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbm2b\" (UniqueName: \"kubernetes.io/projected/b032b317-787f-4f39-bf12-aff187fb862f-kube-api-access-cbm2b\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726588 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5lgr\" (UniqueName: \"kubernetes.io/projected/f3456852-8fb7-4e40-81d1-f3ba06088f81-kube-api-access-c5lgr\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726596 4921 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726615 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726625 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726635 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726630 4921 generic.go:334] "Generic (PLEG): container finished" podID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerID="39e6a8d2a389dad40dfed60f96860330eb4be0119c8e89e962a17c2926a11993" exitCode=0 Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726644 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3456852-8fb7-4e40-81d1-f3ba06088f81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726696 4921 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b032b317-787f-4f39-bf12-aff187fb862f-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726707 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3456852-8fb7-4e40-81d1-f3ba06088f81-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.726665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86b7dc884f-42l8h" 
event={"ID":"a688bd96-47e0-4ae4-8e94-3c44f964b9e0","Type":"ContainerDied","Data":"39e6a8d2a389dad40dfed60f96860330eb4be0119c8e89e962a17c2926a11993"} Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.735600 4921 generic.go:334] "Generic (PLEG): container finished" podID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerID="40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277" exitCode=0 Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.735644 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.735683 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f3456852-8fb7-4e40-81d1-f3ba06088f81","Type":"ContainerDied","Data":"40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277"} Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.735740 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f3456852-8fb7-4e40-81d1-f3ba06088f81","Type":"ContainerDied","Data":"6f79d6ec4502a0d208ebe7a83131b01164475c00630991f1a1ce404632962d1f"} Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.735757 4921 scope.go:117] "RemoveContainer" containerID="40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.741563 4921 generic.go:334] "Generic (PLEG): container finished" podID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerID="9baac04c9c1b71a48e09955446eb2be6ec58220298201762700babc62bbb9036" exitCode=1 Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.741618 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mrt7s" event={"ID":"6fb2928a-2c88-4045-bc13-a7dca96f9639","Type":"ContainerDied","Data":"9baac04c9c1b71a48e09955446eb2be6ec58220298201762700babc62bbb9036"} Mar 18 12:35:26 crc kubenswrapper[4921]: 
I0318 12:35:26.742170 4921 scope.go:117] "RemoveContainer" containerID="9baac04c9c1b71a48e09955446eb2be6ec58220298201762700babc62bbb9036" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.749082 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.750065 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"bf1cfbea-21b1-4f20-95c2-22c07304789c","Type":"ContainerDied","Data":"20e2d01249d508dac8a4060bfbea3c12d079d255dbbda323862516c8c4613cb0"} Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.750084 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.760897 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b032b317-787f-4f39-bf12-aff187fb862f","Type":"ContainerDied","Data":"0500708b258ca4763f923aea09cf242fd64c6ed3598052fb75fcb177f3402a1f"} Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.760941 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.760919 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.785850 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7f8c57d85f-gdqrq" Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.789355 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.789894 4921 scope.go:117] "RemoveContainer" containerID="e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d" Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.791093 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.794730 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.800265 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.800817 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.800854 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.801051 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.801075 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.839186 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" 
DevicePath \"\"" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.872304 4921 scope.go:117] "RemoveContainer" containerID="40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277" Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.875422 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277\": container with ID starting with 40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277 not found: ID does not exist" containerID="40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.875498 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277"} err="failed to get container status \"40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277\": rpc error: code = NotFound desc = could not find container \"40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277\": container with ID starting with 40e759694ef16fc9d681a6267295ac10c47868a504d2d8bd63aaeec8bee52277 not found: ID does not exist" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.875549 4921 scope.go:117] "RemoveContainer" containerID="e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d" Mar 18 12:35:26 crc kubenswrapper[4921]: E0318 12:35:26.884745 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d\": container with ID starting with e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d not found: ID does not exist" containerID="e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.884880 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d"} err="failed to get container status \"e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d\": rpc error: code = NotFound desc = could not find container \"e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d\": container with ID starting with e664b9693da91cd1a41a6b84b980093cbff46441e4301ee17b81458d1087c99d not found: ID does not exist" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.884997 4921 scope.go:117] "RemoveContainer" containerID="ba3db590f738d1f28baa164d30603e96d6981b683ac84eb657fd01aa1145a053" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.891529 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.903423 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.909538 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.930572 4921 scope.go:117] "RemoveContainer" containerID="b6f6eac2f1ae64a2931ebab6341ccdde8403127e4d28bf10d8ffd9bcbe4a8ea3" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.945457 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.953259 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.958974 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.968284 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 
12:35:26.977599 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.986542 4921 scope.go:117] "RemoveContainer" containerID="f0d4324420151331450e6d095a9a2716913bafe1a8e2ec54e2ee054bd54bde23" Mar 18 12:35:26 crc kubenswrapper[4921]: I0318 12:35:26.993367 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f8c57d85f-gdqrq"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.008019 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7f8c57d85f-gdqrq"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.219292 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" path="/var/lib/kubelet/pods/3ee31803-52cb-4fcd-8ab1-990b0440a67a/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.220205 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bedf417-75a0-4163-88ee-c11ea02ae1f4" path="/var/lib/kubelet/pods/4bedf417-75a0-4163-88ee-c11ea02ae1f4/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.220724 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" path="/var/lib/kubelet/pods/66ca2e8c-3d98-4448-afa4-8a278b5f0c54/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.221973 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" path="/var/lib/kubelet/pods/870b5852-0790-4d4a-a0c1-df7789287b36/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.222757 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" path="/var/lib/kubelet/pods/abf100e1-595d-4dce-9125-c27db7e9408a/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.232306 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b032b317-787f-4f39-bf12-aff187fb862f" path="/var/lib/kubelet/pods/b032b317-787f-4f39-bf12-aff187fb862f/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.232951 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc025b71-4b10-41e1-bccf-0d67a9b36b0f" path="/var/lib/kubelet/pods/bc025b71-4b10-41e1-bccf-0d67a9b36b0f/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.233614 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" path="/var/lib/kubelet/pods/bf1cfbea-21b1-4f20-95c2-22c07304789c/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.235275 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1887f67-2204-45c0-be34-9471594f217c" path="/var/lib/kubelet/pods/e1887f67-2204-45c0-be34-9471594f217c/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.236081 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" path="/var/lib/kubelet/pods/f3456852-8fb7-4e40-81d1-f3ba06088f81/volumes" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.296002 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.296337 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-central-agent" containerID="cri-o://a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510" gracePeriod=30 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.296482 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="proxy-httpd" containerID="cri-o://cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c" gracePeriod=30 Mar 18 12:35:27 crc 
kubenswrapper[4921]: I0318 12:35:27.296531 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="sg-core" containerID="cri-o://61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033" gracePeriod=30 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.296561 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-notification-agent" containerID="cri-o://ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930" gracePeriod=30 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.333994 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.334220 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1414d026-b9f7-4fb9-ae37-0de669bf759f" containerName="kube-state-metrics" containerID="cri-o://dfc603010f57b8628b4ea1ee256d5bb53992fc758a2b5b05d3543c3947d66b31" gracePeriod=30 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.476194 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4f7a-account-create-update-xn88l"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.495164 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.495397 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="cf2f76d2-7d0e-450c-8218-0cf40e03cbee" containerName="memcached" containerID="cri-o://d1d72c66ce5a1eb5ac5faf2d0a921bafbdff4044fe05e8e42218253919f68d86" gracePeriod=30 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.521215 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-4f7a-account-create-update-xn88l"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.541318 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sthxw"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.559475 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sthxw"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.578321 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-fqckd"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.600897 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4f7a-account-create-update-6bjmm"] Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601343 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerName="init" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601362 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerName="init" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601381 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerName="dnsmasq-dns" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601387 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerName="dnsmasq-dns" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601397 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="ovsdbserver-sb" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601403 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="ovsdbserver-sb" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601415 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerName="galera" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601421 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerName="galera" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601430 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="probe" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601437 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="probe" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601447 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601453 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601465 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bedf417-75a0-4163-88ee-c11ea02ae1f4" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601471 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bedf417-75a0-4163-88ee-c11ea02ae1f4" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601482 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-server" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601487 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-server" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601495 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b032b317-787f-4f39-bf12-aff187fb862f" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601501 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b032b317-787f-4f39-bf12-aff187fb862f" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601511 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerName="mysql-bootstrap" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601517 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerName="mysql-bootstrap" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601525 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-httpd" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601531 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-httpd" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601543 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="ovsdbserver-nb" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601549 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="ovsdbserver-nb" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601556 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601561 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601569 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601575 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.601589 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="cinder-scheduler" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601596 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="cinder-scheduler" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601746 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-server" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601759 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601767 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bedf417-75a0-4163-88ee-c11ea02ae1f4" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601775 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1887f67-2204-45c0-be34-9471594f217c" containerName="ovsdbserver-sb" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601785 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="870b5852-0790-4d4a-a0c1-df7789287b36" containerName="proxy-httpd" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601800 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3456852-8fb7-4e40-81d1-f3ba06088f81" containerName="galera" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 
12:35:27.601810 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ca2e8c-3d98-4448-afa4-8a278b5f0c54" containerName="dnsmasq-dns" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601818 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="ovsdbserver-nb" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601828 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee31803-52cb-4fcd-8ab1-990b0440a67a" containerName="ovn-controller" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601837 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf100e1-595d-4dce-9125-c27db7e9408a" containerName="openstack-network-exporter" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601847 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="cinder-scheduler" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601856 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1cfbea-21b1-4f20-95c2-22c07304789c" containerName="probe" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.601862 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b032b317-787f-4f39-bf12-aff187fb862f" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.602554 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.605645 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.619314 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-fqckd"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.630170 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4f7a-account-create-update-6bjmm"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.642012 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.650009 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56b6658ccd-lzk2m"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.650395 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-56b6658ccd-lzk2m" podUID="5e6d9230-4481-43b3-891b-066a3bc6a46f" containerName="keystone-api" containerID="cri-o://9cae82dc6d9adf77b41d7fa82acd4e2e90f6f5263c54efe2afe1b22b3ad1b371" gracePeriod=30 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.726873 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mrt7s"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.736862 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jdm4q"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.744655 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jdm4q"] Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.753954 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4f7a-account-create-update-6bjmm"] Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.754958 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-kxpr8 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-4f7a-account-create-update-6bjmm" podUID="41428169-d582-4ffb-9c81-fec1b99d13a3" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.785769 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxpr8\" (UniqueName: \"kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.785826 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.879759 4921 generic.go:334] "Generic (PLEG): container finished" podID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerID="61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033" exitCode=2 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.879879 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerDied","Data":"61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033"} Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.887295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts\") pod 
\"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.887448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpr8\" (UniqueName: \"kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.887778 4921 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.887819 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts podName:41428169-d582-4ffb-9c81-fec1b99d13a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:28.387805524 +0000 UTC m=+1547.937726163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts") pod "keystone-4f7a-account-create-update-6bjmm" (UID: "41428169-d582-4ffb-9c81-fec1b99d13a3") : configmap "openstack-scripts" not found Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.895767 4921 projected.go:194] Error preparing data for projected volume kube-api-access-kxpr8 for pod openstack/keystone-4f7a-account-create-update-6bjmm: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.895850 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8 podName:41428169-d582-4ffb-9c81-fec1b99d13a3 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:35:28.395830432 +0000 UTC m=+1547.945751071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpr8" (UniqueName: "kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8") pod "keystone-4f7a-account-create-update-6bjmm" (UID: "41428169-d582-4ffb-9c81-fec1b99d13a3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.910228 4921 generic.go:334] "Generic (PLEG): container finished" podID="1414d026-b9f7-4fb9-ae37-0de669bf759f" containerID="dfc603010f57b8628b4ea1ee256d5bb53992fc758a2b5b05d3543c3947d66b31" exitCode=2 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.910281 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1414d026-b9f7-4fb9-ae37-0de669bf759f","Type":"ContainerDied","Data":"dfc603010f57b8628b4ea1ee256d5bb53992fc758a2b5b05d3543c3947d66b31"} Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.911972 4921 generic.go:334] "Generic (PLEG): container finished" podID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerID="db9a0c1811b401c4f15334f71054371c07cdb06d6ce632ece0b7217159b060b3" exitCode=0 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.912011 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf","Type":"ContainerDied","Data":"db9a0c1811b401c4f15334f71054371c07cdb06d6ce632ece0b7217159b060b3"} Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.951654 4921 generic.go:334] "Generic (PLEG): container finished" podID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerID="1a2e4314370ce5c4fae41b6b33bc781b07c039542465afe1cc609a0b0992990b" exitCode=0 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.951732 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a3271455-7c85-4b68-a27f-fb648ae6abc9","Type":"ContainerDied","Data":"1a2e4314370ce5c4fae41b6b33bc781b07c039542465afe1cc609a0b0992990b"} Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.971384 4921 generic.go:334] "Generic (PLEG): container finished" podID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerID="404fbdad0acbc101609fe321e0ebd443a1518297cc849df46edbfb574ac4328a" exitCode=0 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.971490 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"778f8baf-82ce-457d-b32d-35d3abe1a79d","Type":"ContainerDied","Data":"404fbdad0acbc101609fe321e0ebd443a1518297cc849df46edbfb574ac4328a"} Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.996217 4921 generic.go:334] "Generic (PLEG): container finished" podID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerID="de3a1c8ff3cba548217ab37c3ea36a19802e3bb6be2f9d8686c8c04103576451" exitCode=1 Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.996594 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mrt7s" event={"ID":"6fb2928a-2c88-4045-bc13-a7dca96f9639","Type":"ContainerDied","Data":"de3a1c8ff3cba548217ab37c3ea36a19802e3bb6be2f9d8686c8c04103576451"} Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.996701 4921 scope.go:117] "RemoveContainer" containerID="9baac04c9c1b71a48e09955446eb2be6ec58220298201762700babc62bbb9036" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.996792 4921 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-mrt7s" secret="" err="secret \"galera-openstack-dockercfg-b5xqq\" not found" Mar 18 12:35:27 crc kubenswrapper[4921]: I0318 12:35:27.996857 4921 scope.go:117] "RemoveContainer" containerID="de3a1c8ff3cba548217ab37c3ea36a19802e3bb6be2f9d8686c8c04103576451" Mar 18 12:35:27 crc kubenswrapper[4921]: E0318 12:35:27.997170 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-mrt7s_openstack(6fb2928a-2c88-4045-bc13-a7dca96f9639)\"" pod="openstack/root-account-create-update-mrt7s" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.000476 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.014350 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.017300 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cf79cb9db-9pf9t" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:40340->10.217.0.162:9311: read: connection reset by peer" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.017735 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cf79cb9db-9pf9t" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:40336->10.217.0.162:9311: read: connection reset by peer" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.090906 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": read tcp 10.217.0.2:50666->10.217.0.166:8776: read: connection reset by peer" Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.099182 4921 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.099239 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts podName:6fb2928a-2c88-4045-bc13-a7dca96f9639 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:28.599224695 +0000 UTC m=+1548.149145334 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts") pod "root-account-create-update-mrt7s" (UID: "6fb2928a-2c88-4045-bc13-a7dca96f9639") : configmap "openstack-scripts" not found Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.167692 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="galera" containerID="cri-o://df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff" gracePeriod=30 Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.388261 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b is running failed: container process not found" containerID="3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.396163 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b is running failed: container process not found" containerID="3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.401227 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b is running failed: container process not found" containerID="3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.401292 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="1a183a61-e314-4bd0-b332-3d216d70c6c2" containerName="nova-cell1-conductor-conductor" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.415220 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.415436 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpr8\" (UniqueName: \"kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm" Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.416665 4921 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.416710 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts podName:41428169-d582-4ffb-9c81-fec1b99d13a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:29.416698967 +0000 UTC m=+1548.966619606 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts") pod "keystone-4f7a-account-create-update-6bjmm" (UID: "41428169-d582-4ffb-9c81-fec1b99d13a3") : configmap "openstack-scripts" not found Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.421501 4921 projected.go:194] Error preparing data for projected volume kube-api-access-kxpr8 for pod openstack/keystone-4f7a-account-create-update-6bjmm: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.421949 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8 podName:41428169-d582-4ffb-9c81-fec1b99d13a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:29.421764301 +0000 UTC m=+1548.971685030 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpr8" (UniqueName: "kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8") pod "keystone-4f7a-account-create-update-6bjmm" (UID: "41428169-d582-4ffb-9c81-fec1b99d13a3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.613843 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.620641 4921 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.620693 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts podName:6fb2928a-2c88-4045-bc13-a7dca96f9639 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:35:29.620679727 +0000 UTC m=+1549.170600376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts") pod "root-account-create-update-mrt7s" (UID: "6fb2928a-2c88-4045-bc13-a7dca96f9639") : configmap "openstack-scripts" not found Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.646067 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.677421 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.205:3000/\": read tcp 10.217.0.2:34126->10.217.0.205:3000: read: connection reset by peer" Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.680319 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce is running failed: container process not found" containerID="fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.680872 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce is running failed: container process not found" containerID="fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.681843 4921 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce is running failed: container process not found" containerID="fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:35:28 crc kubenswrapper[4921]: E0318 12:35:28.681878 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f3fdc858-ca78-4137-b287-a3015e80b660" containerName="nova-scheduler-scheduler" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.688409 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-689c8956b9-wzd7n" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.704375 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.721539 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-config-data\") pod \"a3271455-7c85-4b68-a27f-fb648ae6abc9\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.721617 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3271455-7c85-4b68-a27f-fb648ae6abc9-logs\") pod \"a3271455-7c85-4b68-a27f-fb648ae6abc9\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.721639 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-internal-tls-certs\") pod \"a3271455-7c85-4b68-a27f-fb648ae6abc9\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.721713 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srhkt\" (UniqueName: \"kubernetes.io/projected/a3271455-7c85-4b68-a27f-fb648ae6abc9-kube-api-access-srhkt\") pod \"a3271455-7c85-4b68-a27f-fb648ae6abc9\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.721734 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-public-tls-certs\") pod \"a3271455-7c85-4b68-a27f-fb648ae6abc9\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.721877 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-combined-ca-bundle\") pod \"a3271455-7c85-4b68-a27f-fb648ae6abc9\" (UID: \"a3271455-7c85-4b68-a27f-fb648ae6abc9\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.725294 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3271455-7c85-4b68-a27f-fb648ae6abc9-logs" (OuterVolumeSpecName: "logs") pod "a3271455-7c85-4b68-a27f-fb648ae6abc9" (UID: "a3271455-7c85-4b68-a27f-fb648ae6abc9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.729960 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3271455-7c85-4b68-a27f-fb648ae6abc9-kube-api-access-srhkt" (OuterVolumeSpecName: "kube-api-access-srhkt") pod "a3271455-7c85-4b68-a27f-fb648ae6abc9" (UID: "a3271455-7c85-4b68-a27f-fb648ae6abc9"). InnerVolumeSpecName "kube-api-access-srhkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.795374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a3271455-7c85-4b68-a27f-fb648ae6abc9" (UID: "a3271455-7c85-4b68-a27f-fb648ae6abc9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.795716 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3271455-7c85-4b68-a27f-fb648ae6abc9" (UID: "a3271455-7c85-4b68-a27f-fb648ae6abc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.796448 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-config-data" (OuterVolumeSpecName: "config-data") pod "a3271455-7c85-4b68-a27f-fb648ae6abc9" (UID: "a3271455-7c85-4b68-a27f-fb648ae6abc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.802066 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.814437 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a3271455-7c85-4b68-a27f-fb648ae6abc9" (UID: "a3271455-7c85-4b68-a27f-fb648ae6abc9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.824085 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-internal-tls-certs\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.824422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data-custom\") pod \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.824566 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-combined-ca-bundle\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.824732 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-logs\") pod \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.824872 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-scripts\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.825140 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-logs\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.825295 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-config-data\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.825422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-certs\") pod \"1414d026-b9f7-4fb9-ae37-0de669bf759f\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.825575 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-combined-ca-bundle\") pod \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.825710 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-config\") pod \"1414d026-b9f7-4fb9-ae37-0de669bf759f\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.825915 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-httpd-run\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc 
kubenswrapper[4921]: I0318 12:35:28.826059 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.826175 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5blf\" (UniqueName: \"kubernetes.io/projected/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-api-access-t5blf\") pod \"1414d026-b9f7-4fb9-ae37-0de669bf759f\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.826283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rsgv\" (UniqueName: \"kubernetes.io/projected/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-kube-api-access-2rsgv\") pod \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\" (UID: \"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.826404 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data\") pod \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.826558 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9ffm\" (UniqueName: \"kubernetes.io/projected/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-kube-api-access-j9ffm\") pod \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\" (UID: \"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.826699 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-combined-ca-bundle\") pod \"1414d026-b9f7-4fb9-ae37-0de669bf759f\" (UID: \"1414d026-b9f7-4fb9-ae37-0de669bf759f\") " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.827401 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.827600 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.827680 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3271455-7c85-4b68-a27f-fb648ae6abc9-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.827752 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.827863 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srhkt\" (UniqueName: \"kubernetes.io/projected/a3271455-7c85-4b68-a27f-fb648ae6abc9-kube-api-access-srhkt\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.827938 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3271455-7c85-4b68-a27f-fb648ae6abc9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.841143 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-logs" 
(OuterVolumeSpecName: "logs") pod "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" (UID: "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.846438 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.847657 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-logs" (OuterVolumeSpecName: "logs") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.867885 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" (UID: "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.872710 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.874349 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-kube-api-access-j9ffm" (OuterVolumeSpecName: "kube-api-access-j9ffm") pod "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" (UID: "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a"). InnerVolumeSpecName "kube-api-access-j9ffm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.874597 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-kube-api-access-2rsgv" (OuterVolumeSpecName: "kube-api-access-2rsgv") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "kube-api-access-2rsgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.875154 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-api-access-t5blf" (OuterVolumeSpecName: "kube-api-access-t5blf") pod "1414d026-b9f7-4fb9-ae37-0de669bf759f" (UID: "1414d026-b9f7-4fb9-ae37-0de669bf759f"). InnerVolumeSpecName "kube-api-access-t5blf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.876799 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-scripts" (OuterVolumeSpecName: "scripts") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.928275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1414d026-b9f7-4fb9-ae37-0de669bf759f" (UID: "1414d026-b9f7-4fb9-ae37-0de669bf759f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929835 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929869 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929883 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929896 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929906 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929934 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929946 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5blf\" (UniqueName: \"kubernetes.io/projected/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-api-access-t5blf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929959 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rsgv\" (UniqueName: \"kubernetes.io/projected/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-kube-api-access-2rsgv\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929969 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9ffm\" (UniqueName: \"kubernetes.io/projected/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-kube-api-access-j9ffm\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.929979 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:28 crc kubenswrapper[4921]: I0318 12:35:28.964949 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.026662 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.027643 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.033279 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3271455-7c85-4b68-a27f-fb648ae6abc9","Type":"ContainerDied","Data":"c642a6e5f87422babb22afbb7db7db58bf22249deba47d2736696b09d658952f"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.033343 4921 scope.go:117] "RemoveContainer" containerID="1a2e4314370ce5c4fae41b6b33bc781b07c039542465afe1cc609a0b0992990b" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.033495 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.034937 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.034979 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.046717 4921 generic.go:334] "Generic (PLEG): container finished" podID="f3fdc858-ca78-4137-b287-a3015e80b660" containerID="fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce" exitCode=0 Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.046793 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3fdc858-ca78-4137-b287-a3015e80b660","Type":"ContainerDied","Data":"fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.066641 4921 generic.go:334] "Generic (PLEG): container finished" podID="b574f65d-9f59-41e3-bec6-59c25cc847fe" 
containerID="d0425a7018f55d15fcb50cbcccfeff3feea433294ce78fd911a563de3145ed79" exitCode=0 Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.066714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b574f65d-9f59-41e3-bec6-59c25cc847fe","Type":"ContainerDied","Data":"d0425a7018f55d15fcb50cbcccfeff3feea433294ce78fd911a563de3145ed79"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.068196 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.087559 4921 generic.go:334] "Generic (PLEG): container finished" podID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerID="2ad13996e9b7e948e5782e01e1bad550f76a615ba18352b010ea6a536fbadc36" exitCode=0 Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.087644 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3270a214-054c-4c39-aedc-9ba6fb58a7ae","Type":"ContainerDied","Data":"2ad13996e9b7e948e5782e01e1bad550f76a615ba18352b010ea6a536fbadc36"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.095026 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cf79cb9db-9pf9t" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.095384 4921 generic.go:334] "Generic (PLEG): container finished" podID="cf2f76d2-7d0e-450c-8218-0cf40e03cbee" containerID="d1d72c66ce5a1eb5ac5faf2d0a921bafbdff4044fe05e8e42218253919f68d86" exitCode=0 Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.095453 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cf2f76d2-7d0e-450c-8218-0cf40e03cbee","Type":"ContainerDied","Data":"d1d72c66ce5a1eb5ac5faf2d0a921bafbdff4044fe05e8e42218253919f68d86"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.098092 4921 generic.go:334] "Generic (PLEG): container finished" podID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerID="cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c" exitCode=0 Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.098161 4921 generic.go:334] "Generic (PLEG): container finished" podID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerID="a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510" exitCode=0 Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.098229 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerDied","Data":"cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.098255 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerDied","Data":"a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.101467 4921 generic.go:334] "Generic (PLEG): container finished" podID="18566d04-485b-411a-a1b8-e761a7fa6933" containerID="d3111e0bf89186d6ceed4c9cbb069267d0c8685ca3cf5793ac787eddf6cf6018" exitCode=0 Mar 18 12:35:29 
crc kubenswrapper[4921]: I0318 12:35:29.101539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" event={"ID":"18566d04-485b-411a-a1b8-e761a7fa6933","Type":"ContainerDied","Data":"d3111e0bf89186d6ceed4c9cbb069267d0c8685ca3cf5793ac787eddf6cf6018"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.103752 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "1414d026-b9f7-4fb9-ae37-0de669bf759f" (UID: "1414d026-b9f7-4fb9-ae37-0de669bf759f"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.112397 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.112674 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1414d026-b9f7-4fb9-ae37-0de669bf759f","Type":"ContainerDied","Data":"58bd963f7c92608cc2e299bca05474fee38d53522984eac94e66beabdc1ea3d7"} Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.118025 4921 scope.go:117] "RemoveContainer" containerID="2aace202e5bc4e616801b05d9c08062b31861e303a33d0aee12e11730dc18d7e" Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.124641 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.129407 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.129864 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf","Type":"ContainerDied","Data":"380a8908656a348eaf0d21e91db7964db5cd144c4269305029bc41611a373e8f"} Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.133381 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.134156 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" (UID: "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.135351 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.135903 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-httpd-run\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.135989 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-public-tls-certs\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136106 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136177 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-scripts\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136263 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9dt\" (UniqueName: \"kubernetes.io/projected/778f8baf-82ce-457d-b32d-35d3abe1a79d-kube-api-access-5j9dt\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136284 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-combined-ca-bundle\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136331 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-logs\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136370 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-config-data\") pod \"778f8baf-82ce-457d-b32d-35d3abe1a79d\" (UID: \"778f8baf-82ce-457d-b32d-35d3abe1a79d\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136803 4921 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.136819 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.142836 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.142911 4921 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" containerName="nova-cell0-conductor-conductor"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.144251 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-logs" (OuterVolumeSpecName: "logs") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.146501 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.148775 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.151220 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778f8baf-82ce-457d-b32d-35d3abe1a79d-kube-api-access-5j9dt" (OuterVolumeSpecName: "kube-api-access-5j9dt") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "kube-api-access-5j9dt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.152419 4921 generic.go:334] "Generic (PLEG): container finished" podID="08667791-7c42-46d1-a74b-436dfefa5db3" containerID="cfefeafefb675bac464d8262a5b628863032fde78483031396ecf5c2c726f1af" exitCode=0
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.152538 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf79cb9db-9pf9t" event={"ID":"08667791-7c42-46d1-a74b-436dfefa5db3","Type":"ContainerDied","Data":"cfefeafefb675bac464d8262a5b628863032fde78483031396ecf5c2c726f1af"}
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.152806 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cf79cb9db-9pf9t"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.160091 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-scripts" (OuterVolumeSpecName: "scripts") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.161623 4921 generic.go:334] "Generic (PLEG): container finished" podID="1a183a61-e314-4bd0-b332-3d216d70c6c2" containerID="3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b" exitCode=0
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.161755 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1a183a61-e314-4bd0-b332-3d216d70c6c2","Type":"ContainerDied","Data":"3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b"}
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.163758 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.165848 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.168033 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"778f8baf-82ce-457d-b32d-35d3abe1a79d","Type":"ContainerDied","Data":"106bdca30eb650df18be845da90fb4c97ae19396f0dffcbef1e8c42102f9c63c"}
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.168557 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.168812 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-config-data" (OuterVolumeSpecName: "config-data") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.175319 4921 generic.go:334] "Generic (PLEG): container finished" podID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerID="21996dae0b2690b430cf03b884c33f3ef56bfe6e6623a7ddd63437c6d50e1ff5" exitCode=0
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.175426 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc469779b-2mfpc" event={"ID":"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71","Type":"ContainerDied","Data":"21996dae0b2690b430cf03b884c33f3ef56bfe6e6623a7ddd63437c6d50e1ff5"}
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.183976 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "1414d026-b9f7-4fb9-ae37-0de669bf759f" (UID: "1414d026-b9f7-4fb9-ae37-0de669bf759f"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.188791 4921 generic.go:334] "Generic (PLEG): container finished" podID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerID="59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632" exitCode=0
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.188875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-689c8956b9-wzd7n" event={"ID":"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a","Type":"ContainerDied","Data":"59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632"}
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.188908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-689c8956b9-wzd7n" event={"ID":"cc7ad6a4-5d48-43b9-a85c-da13a0beed6a","Type":"ContainerDied","Data":"e1368e5fad63bd0bf40d1f93b49d2521faa2b23cb0717c5d6788756c91967068"}
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.189277 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-689c8956b9-wzd7n"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.203348 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4f7a-account-create-update-6bjmm"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.204779 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" (UID: "7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.213410 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-config-data" (OuterVolumeSpecName: "config-data") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.225670 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8872fccc-7e47-4ab7-8b31-c81b93fc72de" path="/var/lib/kubelet/pods/8872fccc-7e47-4ab7-8b31-c81b93fc72de/volumes"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.226405 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" path="/var/lib/kubelet/pods/a3271455-7c85-4b68-a27f-fb648ae6abc9/volumes"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.227097 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa03737-632a-4574-a31a-93b05e3be3f0" path="/var/lib/kubelet/pods/aaa03737-632a-4574-a31a-93b05e3be3f0/volumes"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.229550 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dceba139-8e2e-4533-b22c-08d898ffadb5" path="/var/lib/kubelet/pods/dceba139-8e2e-4533-b22c-08d898ffadb5/volumes"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.230496 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec467fe9-ec1a-4b58-a0b0-b745e4c41f69" path="/var/lib/kubelet/pods/ec467fe9-ec1a-4b58-a0b0-b745e4c41f69/volumes"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239346 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-nova-metadata-tls-certs\") pod \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239406 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvrw\" (UniqueName: \"kubernetes.io/projected/3270a214-054c-4c39-aedc-9ba6fb58a7ae-kube-api-access-4jvrw\") pod \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239436 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3270a214-054c-4c39-aedc-9ba6fb58a7ae-logs\") pod \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239477 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-public-tls-certs\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239549 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-combined-ca-bundle\") pod \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data-custom\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239619 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08667791-7c42-46d1-a74b-436dfefa5db3-logs\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239645 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239721 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-internal-tls-certs\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239760 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4rsr\" (UniqueName: \"kubernetes.io/projected/08667791-7c42-46d1-a74b-436dfefa5db3-kube-api-access-k4rsr\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239781 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-combined-ca-bundle\") pod \"08667791-7c42-46d1-a74b-436dfefa5db3\" (UID: \"08667791-7c42-46d1-a74b-436dfefa5db3\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.239827 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-config-data\") pod \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\" (UID: \"3270a214-054c-4c39-aedc-9ba6fb58a7ae\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240422 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240445 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240457 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240470 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9dt\" (UniqueName: \"kubernetes.io/projected/778f8baf-82ce-457d-b32d-35d3abe1a79d-kube-api-access-5j9dt\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240482 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-logs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240493 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240503 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240515 4921 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1414d026-b9f7-4fb9-ae37-0de669bf759f-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240525 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/778f8baf-82ce-457d-b32d-35d3abe1a79d-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240897 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3270a214-054c-4c39-aedc-9ba6fb58a7ae-logs" (OuterVolumeSpecName: "logs") pod "3270a214-054c-4c39-aedc-9ba6fb58a7ae" (UID: "3270a214-054c-4c39-aedc-9ba6fb58a7ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.240947 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08667791-7c42-46d1-a74b-436dfefa5db3-logs" (OuterVolumeSpecName: "logs") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.248486 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3270a214-054c-4c39-aedc-9ba6fb58a7ae-kube-api-access-4jvrw" (OuterVolumeSpecName: "kube-api-access-4jvrw") pod "3270a214-054c-4c39-aedc-9ba6fb58a7ae" (UID: "3270a214-054c-4c39-aedc-9ba6fb58a7ae"). InnerVolumeSpecName "kube-api-access-4jvrw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.251826 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.251906 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08667791-7c42-46d1-a74b-436dfefa5db3-kube-api-access-k4rsr" (OuterVolumeSpecName: "kube-api-access-k4rsr") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "kube-api-access-k4rsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.258365 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.277257 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.287322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data" (OuterVolumeSpecName: "config-data") pod "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" (UID: "cc7ad6a4-5d48-43b9-a85c-da13a0beed6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.288309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "778f8baf-82ce-457d-b32d-35d3abe1a79d" (UID: "778f8baf-82ce-457d-b32d-35d3abe1a79d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.293263 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3270a214-054c-4c39-aedc-9ba6fb58a7ae" (UID: "3270a214-054c-4c39-aedc-9ba6fb58a7ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.334461 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-config-data" (OuterVolumeSpecName: "config-data") pod "3270a214-054c-4c39-aedc-9ba6fb58a7ae" (UID: "3270a214-054c-4c39-aedc-9ba6fb58a7ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344141 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344174 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4rsr\" (UniqueName: \"kubernetes.io/projected/08667791-7c42-46d1-a74b-436dfefa5db3-kube-api-access-k4rsr\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344187 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344198 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344209 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvrw\" (UniqueName: \"kubernetes.io/projected/3270a214-054c-4c39-aedc-9ba6fb58a7ae-kube-api-access-4jvrw\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344220 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3270a214-054c-4c39-aedc-9ba6fb58a7ae-logs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344231 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/778f8baf-82ce-457d-b32d-35d3abe1a79d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344244 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344254 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344265 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.344275 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08667791-7c42-46d1-a74b-436dfefa5db3-logs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.350864 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data" (OuterVolumeSpecName: "config-data") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.354082 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.385392 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.403161 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3270a214-054c-4c39-aedc-9ba6fb58a7ae" (UID: "3270a214-054c-4c39-aedc-9ba6fb58a7ae"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.406452 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "08667791-7c42-46d1-a74b-436dfefa5db3" (UID: "08667791-7c42-46d1-a74b-436dfefa5db3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxpr8\" (UniqueName: \"kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446362 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts\") pod \"keystone-4f7a-account-create-update-6bjmm\" (UID: \"41428169-d582-4ffb-9c81-fec1b99d13a3\") " pod="openstack/keystone-4f7a-account-create-update-6bjmm"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446522 4921 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3270a214-054c-4c39-aedc-9ba6fb58a7ae-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446537 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446546 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446558 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.446569 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08667791-7c42-46d1-a74b-436dfefa5db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.446648 4921 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.446704 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts podName:41428169-d582-4ffb-9c81-fec1b99d13a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:31.446687132 +0000 UTC m=+1550.996607771 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts") pod "keystone-4f7a-account-create-update-6bjmm" (UID: "41428169-d582-4ffb-9c81-fec1b99d13a3") : configmap "openstack-scripts" not found
Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.447070 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.447131 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data podName:df692663-cc58-4cf1-a05b-566e0152ee90 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:37.447102393 +0000 UTC m=+1556.997023032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data") pod "rabbitmq-server-0" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90") : configmap "rabbitmq-config-data" not found
Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.452050 4921 projected.go:194] Error preparing data for projected volume kube-api-access-kxpr8 for pod openstack/keystone-4f7a-account-create-update-6bjmm: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.452155 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8 podName:41428169-d582-4ffb-9c81-fec1b99d13a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:31.452135076 +0000 UTC m=+1551.002055715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kxpr8" (UniqueName: "kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8") pod "keystone-4f7a-account-create-update-6bjmm" (UID: "41428169-d582-4ffb-9c81-fec1b99d13a3") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.476935 4921 scope.go:117] "RemoveContainer" containerID="2ad13996e9b7e948e5782e01e1bad550f76a615ba18352b010ea6a536fbadc36"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.499808 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.500389 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cc469779b-2mfpc"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.508951 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.520342 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.537027 4921 scope.go:117] "RemoveContainer" containerID="82561021b876e5fc67fb9e3d59a4200ce1a39a49783bf35f15eda35ba9e8e7a5"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.563514 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.567937 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.592890 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4f7a-account-create-update-6bjmm"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.608651 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4f7a-account-create-update-6bjmm"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.611195 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.616105 4921 scope.go:117] "RemoveContainer" containerID="dfc603010f57b8628b4ea1ee256d5bb53992fc758a2b5b05d3543c3947d66b31"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.634819 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.639358 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.654073 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.658262 4921 scope.go:117] "RemoveContainer" containerID="db9a0c1811b401c4f15334f71054371c07cdb06d6ce632ece0b7217159b060b3"
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.673628 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-internal-tls-certs\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.673901 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-config-data\") pod \"1a183a61-e314-4bd0-b332-3d216d70c6c2\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.673958 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-combined-ca-bundle\") pod \"18566d04-485b-411a-a1b8-e761a7fa6933\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.673990 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data-custom\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.674476 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b574f65d-9f59-41e3-bec6-59c25cc847fe-logs\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.674531 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-internal-tls-certs\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.674562 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmknc\" (UniqueName: \"kubernetes.io/projected/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-kube-api-access-hmknc\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.675212 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b574f65d-9f59-41e3-bec6-59c25cc847fe-logs" (OuterVolumeSpecName: "logs") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.675737 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-public-tls-certs\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.675780 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-public-tls-certs\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.675803 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b574f65d-9f59-41e3-bec6-59c25cc847fe-etc-machine-id\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") "
Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.675892 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b574f65d-9f59-41e3-bec6-59c25cc847fe-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "etc-machine-id".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.678752 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data-custom\") pod \"18566d04-485b-411a-a1b8-e761a7fa6933\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.678869 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-combined-ca-bundle\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.678897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-scripts\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.678926 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data\") pod \"18566d04-485b-411a-a1b8-e761a7fa6933\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.678949 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ph46\" (UniqueName: \"kubernetes.io/projected/18566d04-485b-411a-a1b8-e761a7fa6933-kube-api-access-4ph46\") pod \"18566d04-485b-411a-a1b8-e761a7fa6933\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.678976 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgvnj\" 
(UniqueName: \"kubernetes.io/projected/1a183a61-e314-4bd0-b332-3d216d70c6c2-kube-api-access-zgvnj\") pod \"1a183a61-e314-4bd0-b332-3d216d70c6c2\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679026 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7g94\" (UniqueName: \"kubernetes.io/projected/b574f65d-9f59-41e3-bec6-59c25cc847fe-kube-api-access-t7g94\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679062 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18566d04-485b-411a-a1b8-e761a7fa6933-logs\") pod \"18566d04-485b-411a-a1b8-e761a7fa6933\" (UID: \"18566d04-485b-411a-a1b8-e761a7fa6933\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679097 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-config-data\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679166 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-scripts\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679198 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-combined-ca-bundle\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 
12:35:29.679225 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-combined-ca-bundle\") pod \"1a183a61-e314-4bd0-b332-3d216d70c6c2\" (UID: \"1a183a61-e314-4bd0-b332-3d216d70c6c2\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679268 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-logs\") pod \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\" (UID: \"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.679290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data\") pod \"b574f65d-9f59-41e3-bec6-59c25cc847fe\" (UID: \"b574f65d-9f59-41e3-bec6-59c25cc847fe\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.689829 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b574f65d-9f59-41e3-bec6-59c25cc847fe-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.689967 4921 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.690038 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts podName:6fb2928a-2c88-4045-bc13-a7dca96f9639 nodeName:}" failed. No retries permitted until 2026-03-18 12:35:31.690015459 +0000 UTC m=+1551.239936098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts") pod "root-account-create-update-mrt7s" (UID: "6fb2928a-2c88-4045-bc13-a7dca96f9639") : configmap "openstack-scripts" not found Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.690647 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "18566d04-485b-411a-a1b8-e761a7fa6933" (UID: "18566d04-485b-411a-a1b8-e761a7fa6933"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.692161 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.692599 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18566d04-485b-411a-a1b8-e761a7fa6933-logs" (OuterVolumeSpecName: "logs") pod "18566d04-485b-411a-a1b8-e761a7fa6933" (UID: "18566d04-485b-411a-a1b8-e761a7fa6933"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.698097 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-logs" (OuterVolumeSpecName: "logs") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.698164 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cf79cb9db-9pf9t"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.718845 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-cf79cb9db-9pf9t"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.725726 4921 scope.go:117] "RemoveContainer" containerID="ed60a39c31c959721c7809f4880462dd279124305bcfe7b9ac91a6dcce1bd85e" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.732468 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.739971 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.742464 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-kube-api-access-hmknc" (OuterVolumeSpecName: "kube-api-access-hmknc") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "kube-api-access-hmknc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.744081 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a183a61-e314-4bd0-b332-3d216d70c6c2-kube-api-access-zgvnj" (OuterVolumeSpecName: "kube-api-access-zgvnj") pod "1a183a61-e314-4bd0-b332-3d216d70c6c2" (UID: "1a183a61-e314-4bd0-b332-3d216d70c6c2"). InnerVolumeSpecName "kube-api-access-zgvnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.768306 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-scripts" (OuterVolumeSpecName: "scripts") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.768362 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18566d04-485b-411a-a1b8-e761a7fa6933-kube-api-access-4ph46" (OuterVolumeSpecName: "kube-api-access-4ph46") pod "18566d04-485b-411a-a1b8-e761a7fa6933" (UID: "18566d04-485b-411a-a1b8-e761a7fa6933"). InnerVolumeSpecName "kube-api-access-4ph46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.768364 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-scripts" (OuterVolumeSpecName: "scripts") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.770901 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b574f65d-9f59-41e3-bec6-59c25cc847fe-kube-api-access-t7g94" (OuterVolumeSpecName: "kube-api-access-t7g94") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "kube-api-access-t7g94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.781876 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.787890 4921 scope.go:117] "RemoveContainer" containerID="cfefeafefb675bac464d8262a5b628863032fde78483031396ecf5c2c726f1af" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792343 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klm8k\" (UniqueName: \"kubernetes.io/projected/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kube-api-access-klm8k\") pod \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792482 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kolla-config\") pod \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792573 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gphsf\" (UniqueName: \"kubernetes.io/projected/f3fdc858-ca78-4137-b287-a3015e80b660-kube-api-access-gphsf\") pod \"f3fdc858-ca78-4137-b287-a3015e80b660\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792606 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-config-data\") pod \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792763 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-memcached-tls-certs\") pod \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792830 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-config-data\") pod \"f3fdc858-ca78-4137-b287-a3015e80b660\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.792968 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-combined-ca-bundle\") pod \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\" (UID: \"cf2f76d2-7d0e-450c-8218-0cf40e03cbee\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793007 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-combined-ca-bundle\") pod \"f3fdc858-ca78-4137-b287-a3015e80b660\" (UID: \"f3fdc858-ca78-4137-b287-a3015e80b660\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793486 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b574f65d-9f59-41e3-bec6-59c25cc847fe-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793506 4921 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793547 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793562 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ph46\" (UniqueName: \"kubernetes.io/projected/18566d04-485b-411a-a1b8-e761a7fa6933-kube-api-access-4ph46\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793575 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgvnj\" (UniqueName: \"kubernetes.io/projected/1a183a61-e314-4bd0-b332-3d216d70c6c2-kube-api-access-zgvnj\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793587 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxpr8\" (UniqueName: \"kubernetes.io/projected/41428169-d582-4ffb-9c81-fec1b99d13a3-kube-api-access-kxpr8\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793598 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41428169-d582-4ffb-9c81-fec1b99d13a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793608 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7g94\" (UniqueName: \"kubernetes.io/projected/b574f65d-9f59-41e3-bec6-59c25cc847fe-kube-api-access-t7g94\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793618 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/18566d04-485b-411a-a1b8-e761a7fa6933-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793629 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793639 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793649 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793659 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793668 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmknc\" (UniqueName: \"kubernetes.io/projected/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-kube-api-access-hmknc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.793752 4921 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:29 crc kubenswrapper[4921]: E0318 12:35:29.793799 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data podName:ef935990-b291-43b7-9d56-673b7b05a7a7 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:35:37.793781414 +0000 UTC m=+1557.343702063 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data") pod "rabbitmq-cell1-server-0" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7") : configmap "rabbitmq-cell1-config-data" not found Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.793806 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-config-data" (OuterVolumeSpecName: "config-data") pod "cf2f76d2-7d0e-450c-8218-0cf40e03cbee" (UID: "cf2f76d2-7d0e-450c-8218-0cf40e03cbee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.794433 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cf2f76d2-7d0e-450c-8218-0cf40e03cbee" (UID: "cf2f76d2-7d0e-450c-8218-0cf40e03cbee"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.794899 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-689c8956b9-wzd7n"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.795745 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-config-data" (OuterVolumeSpecName: "config-data") pod "1a183a61-e314-4bd0-b332-3d216d70c6c2" (UID: "1a183a61-e314-4bd0-b332-3d216d70c6c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.798745 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kube-api-access-klm8k" (OuterVolumeSpecName: "kube-api-access-klm8k") pod "cf2f76d2-7d0e-450c-8218-0cf40e03cbee" (UID: "cf2f76d2-7d0e-450c-8218-0cf40e03cbee"). InnerVolumeSpecName "kube-api-access-klm8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.801048 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-689c8956b9-wzd7n"] Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.820049 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fdc858-ca78-4137-b287-a3015e80b660-kube-api-access-gphsf" (OuterVolumeSpecName: "kube-api-access-gphsf") pod "f3fdc858-ca78-4137-b287-a3015e80b660" (UID: "f3fdc858-ca78-4137-b287-a3015e80b660"). InnerVolumeSpecName "kube-api-access-gphsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.826494 4921 scope.go:117] "RemoveContainer" containerID="199e0bb74b18c94897363ce6c49342390238bfcfe306768c0912bdc03eeb27b4" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.829433 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a183a61-e314-4bd0-b332-3d216d70c6c2" (UID: "1a183a61-e314-4bd0-b332-3d216d70c6c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.837914 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-config-data" (OuterVolumeSpecName: "config-data") pod "f3fdc858-ca78-4137-b287-a3015e80b660" (UID: "f3fdc858-ca78-4137-b287-a3015e80b660"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.863174 4921 scope.go:117] "RemoveContainer" containerID="404fbdad0acbc101609fe321e0ebd443a1518297cc849df46edbfb574ac4328a" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.863178 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18566d04-485b-411a-a1b8-e761a7fa6933" (UID: "18566d04-485b-411a-a1b8-e761a7fa6933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.866869 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894375 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts\") pod \"6fb2928a-2c88-4045-bc13-a7dca96f9639\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894508 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpthc\" (UniqueName: \"kubernetes.io/projected/6fb2928a-2c88-4045-bc13-a7dca96f9639-kube-api-access-gpthc\") pod \"6fb2928a-2c88-4045-bc13-a7dca96f9639\" (UID: \"6fb2928a-2c88-4045-bc13-a7dca96f9639\") " Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894798 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894816 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klm8k\" (UniqueName: \"kubernetes.io/projected/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kube-api-access-klm8k\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894826 4921 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894835 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gphsf\" (UniqueName: \"kubernetes.io/projected/f3fdc858-ca78-4137-b287-a3015e80b660-kube-api-access-gphsf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894845 4921 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894855 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894864 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894875 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a183a61-e314-4bd0-b332-3d216d70c6c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.894906 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fb2928a-2c88-4045-bc13-a7dca96f9639" (UID: "6fb2928a-2c88-4045-bc13-a7dca96f9639"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.896762 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.902670 4921 scope.go:117] "RemoveContainer" containerID="e282050c14e51eefd65b0a5667448f8285cab09cc7f7c0ec5267fa01ddcbb423" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.904784 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb2928a-2c88-4045-bc13-a7dca96f9639-kube-api-access-gpthc" (OuterVolumeSpecName: "kube-api-access-gpthc") pod "6fb2928a-2c88-4045-bc13-a7dca96f9639" (UID: "6fb2928a-2c88-4045-bc13-a7dca96f9639"). InnerVolumeSpecName "kube-api-access-gpthc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.908360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.919057 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf2f76d2-7d0e-450c-8218-0cf40e03cbee" (UID: "cf2f76d2-7d0e-450c-8218-0cf40e03cbee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.921647 4921 scope.go:117] "RemoveContainer" containerID="59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.924596 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3fdc858-ca78-4137-b287-a3015e80b660" (UID: "f3fdc858-ca78-4137-b287-a3015e80b660"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.935908 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.938703 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data" (OuterVolumeSpecName: "config-data") pod "b574f65d-9f59-41e3-bec6-59c25cc847fe" (UID: "b574f65d-9f59-41e3-bec6-59c25cc847fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.968374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data" (OuterVolumeSpecName: "config-data") pod "18566d04-485b-411a-a1b8-e761a7fa6933" (UID: "18566d04-485b-411a-a1b8-e761a7fa6933"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.976334 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-config-data" (OuterVolumeSpecName: "config-data") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.998041 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.998072 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18566d04-485b-411a-a1b8-e761a7fa6933-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.998085 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpthc\" (UniqueName: \"kubernetes.io/projected/6fb2928a-2c88-4045-bc13-a7dca96f9639-kube-api-access-gpthc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.998094 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4921]: I0318 12:35:29.998103 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.017182 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.017226 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fb2928a-2c88-4045-bc13-a7dca96f9639-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.017235 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.017245 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b574f65d-9f59-41e3-bec6-59c25cc847fe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.017255 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fdc858-ca78-4137-b287-a3015e80b660-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.005472 4921 scope.go:117] "RemoveContainer" containerID="a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9" Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.000088 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.036742 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.048346 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.049908 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" (UID: "7c908fc5-8a5a-4d0e-8bf3-2b9964319d71"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.051206 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "cf2f76d2-7d0e-450c-8218-0cf40e03cbee" (UID: "cf2f76d2-7d0e-450c-8218-0cf40e03cbee"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.063703 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.064623 4921 scope.go:117] "RemoveContainer" containerID="59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632" Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.067472 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.067833 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="galera" Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.068935 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.073669 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632\": container with ID starting with 59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632 not found: ID does not exist" containerID="59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.073895 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632"} err="failed to get container status \"59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632\": rpc error: code = NotFound desc = could not find container \"59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632\": container with ID starting with 59d9260e97a011e3c75aa728fa57c2dece8afb53fa26e754fd2fc3b22d80f632 not found: ID does not exist" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.073992 4921 scope.go:117] "RemoveContainer" containerID="a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9" Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.074377 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9\": container with ID starting with a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9 not found: ID does not exist" containerID="a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.074454 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9"} err="failed to get container status \"a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9\": rpc error: code = NotFound desc = could not find container \"a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9\": container with ID 
starting with a2d8996b9956c64436023bc0c3f951c00affb3b4a3306be9176de842c0ad06c9 not found: ID does not exist" Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.083798 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 12:35:30 crc kubenswrapper[4921]: E0318 12:35:30.084013 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="ovn-northd" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.123425 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.123462 4921 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2f76d2-7d0e-450c-8218-0cf40e03cbee-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.123476 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.219707 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3fdc858-ca78-4137-b287-a3015e80b660","Type":"ContainerDied","Data":"a57862f2693d20053806fe91152e94fa5450a604dbe305a8611bf85f26240235"} Mar 18 
12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.219757 4921 scope.go:117] "RemoveContainer" containerID="fe44c22f03c44affae40e253f0612669de9abd78058b75bfda970547cc3455ce" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.219930 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.227416 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mrt7s" event={"ID":"6fb2928a-2c88-4045-bc13-a7dca96f9639","Type":"ContainerDied","Data":"520eb97ba82a0bd746169ea808839074daf3537c9c81e1fd9f67c30656b6f82f"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.227523 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mrt7s" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.234048 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b574f65d-9f59-41e3-bec6-59c25cc847fe","Type":"ContainerDied","Data":"784a1a51473fdd66b5391393d29a4e8b6dce82d46b9a9ae4fa0f022321ea4264"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.235230 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.248874 4921 generic.go:334] "Generic (PLEG): container finished" podID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerID="df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff" exitCode=0 Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.248945 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb6e4980-bd4a-455e-924b-739cee9587c9","Type":"ContainerDied","Data":"df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.250736 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3270a214-054c-4c39-aedc-9ba6fb58a7ae","Type":"ContainerDied","Data":"8d691d3627853ed96d50f458350cd23c33e81c02c1e9fb8f9ba5e0f25b6571cd"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.250821 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.254561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc469779b-2mfpc" event={"ID":"7c908fc5-8a5a-4d0e-8bf3-2b9964319d71","Type":"ContainerDied","Data":"7193354b788eb0607bbcde9142f899d1d30bcc2a2eaea982b5243974db236330"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.254664 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cc469779b-2mfpc" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.267328 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cf2f76d2-7d0e-450c-8218-0cf40e03cbee","Type":"ContainerDied","Data":"75c695c0563603d0230e0dd6504fe9dc600caf3168e14edce241b97f2128826d"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.267333 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.286285 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.287817 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1a183a61-e314-4bd0-b332-3d216d70c6c2","Type":"ContainerDied","Data":"2a02e55df0f22299348136d542d6bb374bcee9d78d5782a167168731d958a271"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.304106 4921 scope.go:117] "RemoveContainer" containerID="de3a1c8ff3cba548217ab37c3ea36a19802e3bb6be2f9d8686c8c04103576451" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.312666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" event={"ID":"18566d04-485b-411a-a1b8-e761a7fa6933","Type":"ContainerDied","Data":"0e13cacfeb90b9a196839bf99eb26021ed4bdb56e3f74be9a8465811c11456ef"} Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.312780 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c47f5f4db-j6swx" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.314368 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cc469779b-2mfpc"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.324276 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6cc469779b-2mfpc"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.342331 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.342398 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.347692 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.366243 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.375241 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mrt7s"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.399238 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mrt7s"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.410078 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.411753 4921 scope.go:117] "RemoveContainer" containerID="d0425a7018f55d15fcb50cbcccfeff3feea433294ce78fd911a563de3145ed79" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.416727 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.458524 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-c47f5f4db-j6swx"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.470706 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-c47f5f4db-j6swx"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.477777 4921 scope.go:117] "RemoveContainer" containerID="7309c213d8e4e3cb567c026e50b6cdc87298f82a2247b0633434b2e4b5e65f3c" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.491392 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.503527 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.511141 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.518514 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.623036 4921 scope.go:117] "RemoveContainer" containerID="21996dae0b2690b430cf03b884c33f3ef56bfe6e6623a7ddd63437c6d50e1ff5" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.690647 4921 scope.go:117] "RemoveContainer" containerID="ec64628734d6d0cbd273786bbce180980c769e1b8b9804c0a9567e94a61a1793" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.722639 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.727677 4921 scope.go:117] "RemoveContainer" containerID="d1d72c66ce5a1eb5ac5faf2d0a921bafbdff4044fe05e8e42218253919f68d86" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.802263 4921 scope.go:117] "RemoveContainer" containerID="3aac4a7b1c7711e82d2c4070129f600f945e8a4cfb33bbcc42198ac9736afd8b" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.818452 4921 scope.go:117] "RemoveContainer" containerID="d3111e0bf89186d6ceed4c9cbb069267d0c8685ca3cf5793ac787eddf6cf6018" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839607 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839665 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839722 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-confd\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839745 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df692663-cc58-4cf1-a05b-566e0152ee90-pod-info\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc 
kubenswrapper[4921]: I0318 12:35:30.839768 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s5fz\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-kube-api-access-7s5fz\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839824 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df692663-cc58-4cf1-a05b-566e0152ee90-erlang-cookie-secret\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839849 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-tls\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839865 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-server-conf\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839890 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-erlang-cookie\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839917 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-plugins-conf\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.839937 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-plugins\") pod \"df692663-cc58-4cf1-a05b-566e0152ee90\" (UID: \"df692663-cc58-4cf1-a05b-566e0152ee90\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.840371 4921 scope.go:117] "RemoveContainer" containerID="d769e05c72dbc3bfb0249264660c0d34a67576be10d78c4d0f42227432990ac5" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.840774 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.841155 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.841603 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). 
InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.844974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/df692663-cc58-4cf1-a05b-566e0152ee90-pod-info" (OuterVolumeSpecName: "pod-info") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.845939 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.851712 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.851841 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df692663-cc58-4cf1-a05b-566e0152ee90-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.861462 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.863352 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-kube-api-access-7s5fz" (OuterVolumeSpecName: "kube-api-access-7s5fz") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "kube-api-access-7s5fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.880418 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data" (OuterVolumeSpecName: "config-data") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.923434 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-server-conf" (OuterVolumeSpecName: "server-conf") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.941056 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef935990-b291-43b7-9d56-673b7b05a7a7-pod-info\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.941106 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-plugins-conf\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.941159 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.941823 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.941976 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-tls\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942066 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkmdl\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-kube-api-access-vkmdl\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942092 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef935990-b291-43b7-9d56-673b7b05a7a7-erlang-cookie-secret\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942172 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-plugins\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942198 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-erlang-cookie\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942252 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942315 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-server-conf\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-confd\") pod \"ef935990-b291-43b7-9d56-673b7b05a7a7\" (UID: \"ef935990-b291-43b7-9d56-673b7b05a7a7\") " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942770 4921 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df692663-cc58-4cf1-a05b-566e0152ee90-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942788 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s5fz\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-kube-api-access-7s5fz\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942800 4921 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df692663-cc58-4cf1-a05b-566e0152ee90-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942811 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942821 4921 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942832 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942844 4921 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942855 4921 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942865 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942875 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df692663-cc58-4cf1-a05b-566e0152ee90-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.942928 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.943179 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.943837 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.948899 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef935990-b291-43b7-9d56-673b7b05a7a7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.948905 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ef935990-b291-43b7-9d56-673b7b05a7a7-pod-info" (OuterVolumeSpecName: "pod-info") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.948945 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). 
InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.949080 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.949209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-kube-api-access-vkmdl" (OuterVolumeSpecName: "kube-api-access-vkmdl") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "kube-api-access-vkmdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.961562 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 18 12:35:30 crc kubenswrapper[4921]: I0318 12:35:30.987746 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "df692663-cc58-4cf1-a05b-566e0152ee90" (UID: "df692663-cc58-4cf1-a05b-566e0152ee90"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.029649 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data" (OuterVolumeSpecName: "config-data") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.037810 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-server-conf" (OuterVolumeSpecName: "server-conf") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044031 4921 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef935990-b291-43b7-9d56-673b7b05a7a7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044089 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044102 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044131 4921 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef935990-b291-43b7-9d56-673b7b05a7a7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc 
kubenswrapper[4921]: I0318 12:35:31.044146 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkmdl\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-kube-api-access-vkmdl\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044155 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044167 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044177 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044188 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044196 4921 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef935990-b291-43b7-9d56-673b7b05a7a7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.044204 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df692663-cc58-4cf1-a05b-566e0152ee90-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.058140 4921 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.072958 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ef935990-b291-43b7-9d56-673b7b05a7a7" (UID: "ef935990-b291-43b7-9d56-673b7b05a7a7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.145411 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef935990-b291-43b7-9d56-673b7b05a7a7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.145443 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.198697 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.204143 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0/ovn-northd/0.log" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.204196 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.221496 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" path="/var/lib/kubelet/pods/08667791-7c42-46d1-a74b-436dfefa5db3/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.222298 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1414d026-b9f7-4fb9-ae37-0de669bf759f" path="/var/lib/kubelet/pods/1414d026-b9f7-4fb9-ae37-0de669bf759f/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.222974 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" path="/var/lib/kubelet/pods/18566d04-485b-411a-a1b8-e761a7fa6933/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.224188 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a183a61-e314-4bd0-b332-3d216d70c6c2" path="/var/lib/kubelet/pods/1a183a61-e314-4bd0-b332-3d216d70c6c2/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.224714 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" path="/var/lib/kubelet/pods/3270a214-054c-4c39-aedc-9ba6fb58a7ae/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.225131 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41428169-d582-4ffb-9c81-fec1b99d13a3" path="/var/lib/kubelet/pods/41428169-d582-4ffb-9c81-fec1b99d13a3/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.225446 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" path="/var/lib/kubelet/pods/6fb2928a-2c88-4045-bc13-a7dca96f9639/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.226629 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" 
path="/var/lib/kubelet/pods/778f8baf-82ce-457d-b32d-35d3abe1a79d/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.228585 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" path="/var/lib/kubelet/pods/7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.232306 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" path="/var/lib/kubelet/pods/7c908fc5-8a5a-4d0e-8bf3-2b9964319d71/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.234009 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" path="/var/lib/kubelet/pods/b574f65d-9f59-41e3-bec6-59c25cc847fe/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.234947 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" path="/var/lib/kubelet/pods/cc7ad6a4-5d48-43b9-a85c-da13a0beed6a/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.236488 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2f76d2-7d0e-450c-8218-0cf40e03cbee" path="/var/lib/kubelet/pods/cf2f76d2-7d0e-450c-8218-0cf40e03cbee/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.237261 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3fdc858-ca78-4137-b287-a3015e80b660" path="/var/lib/kubelet/pods/f3fdc858-ca78-4137-b287-a3015e80b660/volumes" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250511 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-northd-tls-certs\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250565 
4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-operator-scripts\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250595 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-scripts\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250613 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkscf\" (UniqueName: \"kubernetes.io/projected/bb6e4980-bd4a-455e-924b-739cee9587c9-kube-api-access-xkscf\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250629 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-combined-ca-bundle\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250649 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250850 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-combined-ca-bundle\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: 
\"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250880 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnm5h\" (UniqueName: \"kubernetes.io/projected/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-kube-api-access-tnm5h\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250926 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-metrics-certs-tls-certs\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250943 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-galera-tls-certs\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.250989 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-generated\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.251024 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-rundir\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.251073 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-kolla-config\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.251091 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-config\") pod \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\" (UID: \"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.251128 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-default\") pod \"bb6e4980-bd4a-455e-924b-739cee9587c9\" (UID: \"bb6e4980-bd4a-455e-924b-739cee9587c9\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.255204 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.259851 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-scripts" (OuterVolumeSpecName: "scripts") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.259844 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.260333 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.260534 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.260604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.261755 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-config" (OuterVolumeSpecName: "config") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.266554 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-kube-api-access-tnm5h" (OuterVolumeSpecName: "kube-api-access-tnm5h") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "kube-api-access-tnm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.290928 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.297298 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6e4980-bd4a-455e-924b-739cee9587c9-kube-api-access-xkscf" (OuterVolumeSpecName: "kube-api-access-xkscf") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "kube-api-access-xkscf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.315165 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.324051 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.336957 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "bb6e4980-bd4a-455e-924b-739cee9587c9" (UID: "bb6e4980-bd4a-455e-924b-739cee9587c9"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.338216 4921 generic.go:334] "Generic (PLEG): container finished" podID="5e6d9230-4481-43b3-891b-066a3bc6a46f" containerID="9cae82dc6d9adf77b41d7fa82acd4e2e90f6f5263c54efe2afe1b22b3ad1b371" exitCode=0 Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.343067 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.350302 4921 generic.go:334] "Generic (PLEG): container finished" podID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerID="0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222" exitCode=0 Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.350417 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.356394 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.356957 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkscf\" (UniqueName: \"kubernetes.io/projected/bb6e4980-bd4a-455e-924b-739cee9587c9-kube-api-access-xkscf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357126 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357232 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357458 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357549 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357632 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnm5h\" (UniqueName: \"kubernetes.io/projected/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-kube-api-access-tnm5h\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357707 4921 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb6e4980-bd4a-455e-924b-739cee9587c9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357775 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357850 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357920 4921 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.357990 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.358060 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb6e4980-bd4a-455e-924b-739cee9587c9-config-data-default\") on node \"crc\" DevicePath 
\"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.359653 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0/ovn-northd/0.log" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.359820 4921 generic.go:334] "Generic (PLEG): container finished" podID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" exitCode=139 Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.360016 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.365166 4921 generic.go:334] "Generic (PLEG): container finished" podID="df692663-cc58-4cf1-a05b-566e0152ee90" containerID="330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537" exitCode=0 Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.365414 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.368332 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.374021 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56b6658ccd-lzk2m" event={"ID":"5e6d9230-4481-43b3-891b-066a3bc6a46f","Type":"ContainerDied","Data":"9cae82dc6d9adf77b41d7fa82acd4e2e90f6f5263c54efe2afe1b22b3ad1b371"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400779 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bb6e4980-bd4a-455e-924b-739cee9587c9","Type":"ContainerDied","Data":"e92da77e6fa4610ea197df65bd241bebface9949245ec4d188f3e927c00df125"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400819 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef935990-b291-43b7-9d56-673b7b05a7a7","Type":"ContainerDied","Data":"0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400836 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef935990-b291-43b7-9d56-673b7b05a7a7","Type":"ContainerDied","Data":"4622bc4fb13d840177305177d8e970d9bb86f4736ee8dba42a0d3595e9fb72be"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400848 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0","Type":"ContainerDied","Data":"513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400862 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0","Type":"ContainerDied","Data":"5c457b678e38dcb5ff2c47a0a385a8fd32f276bf88160bc492c81ef53aca628e"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df692663-cc58-4cf1-a05b-566e0152ee90","Type":"ContainerDied","Data":"330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400888 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df692663-cc58-4cf1-a05b-566e0152ee90","Type":"ContainerDied","Data":"d8ad9642d887a2d3381872d62b39b35378edf7a96826efe3b9e9327cfa22a4dc"} Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.400913 4921 scope.go:117] "RemoveContainer" containerID="df86e6b2e89e05d1521d2d08a10069a06f4b099a6a5c7c36becd97a06ed722ff" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.432448 4921 scope.go:117] "RemoveContainer" containerID="ee45e93de09bd0d89582c74ef72aeca3a1f8a1f9676053db278804ba3c1bdc92" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.436689 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.437486 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" (UID: "eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.441400 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.458972 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.459001 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.459019 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.469821 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.478610 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.479946 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.487007 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.487081 4921 scope.go:117] "RemoveContainer" containerID="0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.515655 4921 scope.go:117] "RemoveContainer" containerID="a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed" Mar 
18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.547101 4921 scope.go:117] "RemoveContainer" containerID="0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.547652 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222\": container with ID starting with 0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222 not found: ID does not exist" containerID="0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.547707 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222"} err="failed to get container status \"0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222\": rpc error: code = NotFound desc = could not find container \"0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222\": container with ID starting with 0c7c012edc6c8315566bd5136d20623d7619372c6428dc30d1008bb4e2f02222 not found: ID does not exist" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.547736 4921 scope.go:117] "RemoveContainer" containerID="a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.548004 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed\": container with ID starting with a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed not found: ID does not exist" containerID="a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.548020 4921 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed"} err="failed to get container status \"a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed\": rpc error: code = NotFound desc = could not find container \"a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed\": container with ID starting with a90c673a8e1d22bbd02b8f4fbe1ec39b96136f75649818be201720f3003044ed not found: ID does not exist" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.548033 4921 scope.go:117] "RemoveContainer" containerID="487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.564317 4921 scope.go:117] "RemoveContainer" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.602251 4921 scope.go:117] "RemoveContainer" containerID="487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.605469 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4\": container with ID starting with 487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4 not found: ID does not exist" containerID="487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.605512 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4"} err="failed to get container status \"487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4\": rpc error: code = NotFound desc = could not find container \"487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4\": container with ID starting with 
487289d3aa219168dce4ed9d7716ee1e378b00d22ead4f8610311f8791bef7c4 not found: ID does not exist" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.605537 4921 scope.go:117] "RemoveContainer" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.609730 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3\": container with ID starting with 513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3 not found: ID does not exist" containerID="513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.609770 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3"} err="failed to get container status \"513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3\": rpc error: code = NotFound desc = could not find container \"513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3\": container with ID starting with 513ae584b22af6da2ba1f5192d3e68417b82f47b5273af87256b87f2db9d4cd3 not found: ID does not exist" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.609853 4921 scope.go:117] "RemoveContainer" containerID="330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.668369 4921 scope.go:117] "RemoveContainer" containerID="dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.745428 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.745832 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.751379 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.754089 4921 scope.go:117] "RemoveContainer" containerID="330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.754598 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537\": container with ID starting with 330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537 not found: ID does not exist" containerID="330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.754630 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537"} err="failed to get container status \"330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537\": rpc error: code = NotFound desc = could not find container \"330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537\": container with ID starting with 330b6bed940c0f210dc08a9cd5cb4308207477e3c1b352c5af20646028c7c537 not found: ID does not exist" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.754650 4921 scope.go:117] "RemoveContainer" containerID="dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.754980 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d\": container with ID starting with dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d not found: 
ID does not exist" containerID="dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.755011 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d"} err="failed to get container status \"dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d\": rpc error: code = NotFound desc = could not find container \"dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d\": container with ID starting with dd08d2b470c4f88c4b45cba72ac29a7b126b4c6453b116e1f73316f88da6956d not found: ID does not exist" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.777291 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.777821 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.778089 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" 
containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.778130 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.778901 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.780149 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.781071 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:31 crc kubenswrapper[4921]: E0318 12:35:31.781099 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864286 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-public-tls-certs\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864342 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnl22\" (UniqueName: \"kubernetes.io/projected/5e6d9230-4481-43b3-891b-066a3bc6a46f-kube-api-access-rnl22\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864387 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-internal-tls-certs\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864454 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-scripts\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864478 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-credential-keys\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: 
I0318 12:35:31.864549 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-config-data\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864581 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-fernet-keys\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.864642 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-combined-ca-bundle\") pod \"5e6d9230-4481-43b3-891b-066a3bc6a46f\" (UID: \"5e6d9230-4481-43b3-891b-066a3bc6a46f\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.868310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.869792 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-scripts" (OuterVolumeSpecName: "scripts") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.870256 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6d9230-4481-43b3-891b-066a3bc6a46f-kube-api-access-rnl22" (OuterVolumeSpecName: "kube-api-access-rnl22") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "kube-api-access-rnl22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.872276 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.886627 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-config-data" (OuterVolumeSpecName: "config-data") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.897470 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.916850 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.923610 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5e6d9230-4481-43b3-891b-066a3bc6a46f" (UID: "5e6d9230-4481-43b3-891b-066a3bc6a46f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.924638 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966323 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-log-httpd\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966387 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-ceilometer-tls-certs\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-run-httpd\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966467 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-sg-core-conf-yaml\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966513 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-config-data\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966556 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blkmp\" (UniqueName: 
\"kubernetes.io/projected/4c22e952-1a8a-4998-bcc4-72114cb84c82-kube-api-access-blkmp\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966613 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-combined-ca-bundle\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966639 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-scripts\") pod \"4c22e952-1a8a-4998-bcc4-72114cb84c82\" (UID: \"4c22e952-1a8a-4998-bcc4-72114cb84c82\") " Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966711 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.966837 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967266 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967295 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967307 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967317 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c22e952-1a8a-4998-bcc4-72114cb84c82-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967327 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967339 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967349 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnl22\" (UniqueName: \"kubernetes.io/projected/5e6d9230-4481-43b3-891b-066a3bc6a46f-kube-api-access-rnl22\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967361 4921 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967372 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.967383 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e6d9230-4481-43b3-891b-066a3bc6a46f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.972578 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-scripts" (OuterVolumeSpecName: "scripts") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:31 crc kubenswrapper[4921]: I0318 12:35:31.972640 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c22e952-1a8a-4998-bcc4-72114cb84c82-kube-api-access-blkmp" (OuterVolumeSpecName: "kube-api-access-blkmp") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "kube-api-access-blkmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.001415 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.018652 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.051593 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.056896 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-config-data" (OuterVolumeSpecName: "config-data") pod "4c22e952-1a8a-4998-bcc4-72114cb84c82" (UID: "4c22e952-1a8a-4998-bcc4-72114cb84c82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.069432 4921 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.069464 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.069474 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.069482 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blkmp\" (UniqueName: \"kubernetes.io/projected/4c22e952-1a8a-4998-bcc4-72114cb84c82-kube-api-access-blkmp\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.069491 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.069498 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c22e952-1a8a-4998-bcc4-72114cb84c82-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.182767 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.271213 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-combined-ca-bundle\") pod \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.271315 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4jf\" (UniqueName: \"kubernetes.io/projected/f4f8bb3e-7a43-4a4d-8012-143658c681fc-kube-api-access-qb4jf\") pod \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.271388 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-config-data\") pod \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\" (UID: \"f4f8bb3e-7a43-4a4d-8012-143658c681fc\") " Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.288623 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f8bb3e-7a43-4a4d-8012-143658c681fc-kube-api-access-qb4jf" (OuterVolumeSpecName: "kube-api-access-qb4jf") pod "f4f8bb3e-7a43-4a4d-8012-143658c681fc" (UID: "f4f8bb3e-7a43-4a4d-8012-143658c681fc"). InnerVolumeSpecName "kube-api-access-qb4jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.292284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4f8bb3e-7a43-4a4d-8012-143658c681fc" (UID: "f4f8bb3e-7a43-4a4d-8012-143658c681fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.292828 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-config-data" (OuterVolumeSpecName: "config-data") pod "f4f8bb3e-7a43-4a4d-8012-143658c681fc" (UID: "f4f8bb3e-7a43-4a4d-8012-143658c681fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.373136 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.373190 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4jf\" (UniqueName: \"kubernetes.io/projected/f4f8bb3e-7a43-4a4d-8012-143658c681fc-kube-api-access-qb4jf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.373203 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f8bb3e-7a43-4a4d-8012-143658c681fc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.412321 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" exitCode=0 Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.412376 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f4f8bb3e-7a43-4a4d-8012-143658c681fc","Type":"ContainerDied","Data":"4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec"} Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.412411 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.412438 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f4f8bb3e-7a43-4a4d-8012-143658c681fc","Type":"ContainerDied","Data":"d9fb53a4ac35892c34734ee91b218f6b12a7828255b417f284badbbfa5f0e628"} Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.412485 4921 scope.go:117] "RemoveContainer" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.418995 4921 generic.go:334] "Generic (PLEG): container finished" podID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerID="ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930" exitCode=0 Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.419073 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerDied","Data":"ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930"} Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.419097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c22e952-1a8a-4998-bcc4-72114cb84c82","Type":"ContainerDied","Data":"20f820eb54df8a34d7de96aef11035059e28341a1f438ef8bd31c62bec5925ef"} Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.419180 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.425544 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56b6658ccd-lzk2m" event={"ID":"5e6d9230-4481-43b3-891b-066a3bc6a46f","Type":"ContainerDied","Data":"48f7f18f2360134f5a41a86ebfb15d62951af8a158fe8a854d0c63aba32b959c"} Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.425577 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56b6658ccd-lzk2m" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.462435 4921 scope.go:117] "RemoveContainer" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" Mar 18 12:35:32 crc kubenswrapper[4921]: E0318 12:35:32.462821 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec\": container with ID starting with 4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec not found: ID does not exist" containerID="4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.462867 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec"} err="failed to get container status \"4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec\": rpc error: code = NotFound desc = could not find container \"4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec\": container with ID starting with 4738c39d10280b081c91ab308637d47083fe33c70f6e8c54922804644d75c1ec not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.462895 4921 scope.go:117] "RemoveContainer" containerID="cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.465188 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.482588 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.489062 4921 scope.go:117] "RemoveContainer" containerID="61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033" Mar 18 12:35:32 crc kubenswrapper[4921]: 
I0318 12:35:32.508571 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.522623 4921 scope.go:117] "RemoveContainer" containerID="ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.525031 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.532267 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56b6658ccd-lzk2m"] Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.535877 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-56b6658ccd-lzk2m"] Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.543197 4921 scope.go:117] "RemoveContainer" containerID="a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.560878 4921 scope.go:117] "RemoveContainer" containerID="cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c" Mar 18 12:35:32 crc kubenswrapper[4921]: E0318 12:35:32.561482 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c\": container with ID starting with cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c not found: ID does not exist" containerID="cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.561539 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c"} err="failed to get container status \"cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c\": rpc error: code = NotFound desc = could not find container 
\"cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c\": container with ID starting with cadcc359887f9d6e2951fa80de1c21a0c76bae6deed0fe8ff08cb56167ff9a1c not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.561577 4921 scope.go:117] "RemoveContainer" containerID="61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033" Mar 18 12:35:32 crc kubenswrapper[4921]: E0318 12:35:32.561833 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033\": container with ID starting with 61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033 not found: ID does not exist" containerID="61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.561859 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033"} err="failed to get container status \"61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033\": rpc error: code = NotFound desc = could not find container \"61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033\": container with ID starting with 61759172485b45c8a300890f8684b4c562a8a2f040e40879f5e23cd8eb602033 not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.561878 4921 scope.go:117] "RemoveContainer" containerID="ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930" Mar 18 12:35:32 crc kubenswrapper[4921]: E0318 12:35:32.562093 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930\": container with ID starting with ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930 not found: ID does not exist" 
containerID="ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.562137 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930"} err="failed to get container status \"ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930\": rpc error: code = NotFound desc = could not find container \"ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930\": container with ID starting with ac229167c39daddbb35b9acdb392b56ec127eac0b995bc404b1d970af1e93930 not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.562195 4921 scope.go:117] "RemoveContainer" containerID="a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510" Mar 18 12:35:32 crc kubenswrapper[4921]: E0318 12:35:32.562552 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510\": container with ID starting with a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510 not found: ID does not exist" containerID="a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.562585 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510"} err="failed to get container status \"a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510\": rpc error: code = NotFound desc = could not find container \"a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510\": container with ID starting with a0b4ca994e175a05073fdcd366044a15e441ad883c2b98ef9e073ca866062510 not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4921]: I0318 12:35:32.562606 4921 scope.go:117] 
"RemoveContainer" containerID="9cae82dc6d9adf77b41d7fa82acd4e2e90f6f5263c54efe2afe1b22b3ad1b371" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.217594 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" path="/var/lib/kubelet/pods/4c22e952-1a8a-4998-bcc4-72114cb84c82/volumes" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.218439 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6d9230-4481-43b3-891b-066a3bc6a46f" path="/var/lib/kubelet/pods/5e6d9230-4481-43b3-891b-066a3bc6a46f/volumes" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.219153 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" path="/var/lib/kubelet/pods/bb6e4980-bd4a-455e-924b-739cee9587c9/volumes" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.220421 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" path="/var/lib/kubelet/pods/df692663-cc58-4cf1-a05b-566e0152ee90/volumes" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.221052 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" path="/var/lib/kubelet/pods/eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0/volumes" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.222207 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" path="/var/lib/kubelet/pods/ef935990-b291-43b7-9d56-673b7b05a7a7/volumes" Mar 18 12:35:33 crc kubenswrapper[4921]: I0318 12:35:33.222811 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" path="/var/lib/kubelet/pods/f4f8bb3e-7a43-4a4d-8012-143658c681fc/volumes" Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.774241 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.774890 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.775204 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.775230 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.776715 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.777815 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.778909 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:36 crc kubenswrapper[4921]: E0318 12:35:36.778938 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.776202 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.777095 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.777783 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.778523 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.778589 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.778616 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" 
podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.789822 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:41 crc kubenswrapper[4921]: E0318 12:35:41.789958 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" Mar 18 12:35:42 crc kubenswrapper[4921]: I0318 12:35:42.676962 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-86b7dc884f-42l8h" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": dial tcp 10.217.0.161:9696: connect: connection refused" Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.551094 4921 generic.go:334] "Generic (PLEG): container finished" podID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerID="93b745602f033ec1a6fdfa900798b921c5b73da266fffb39f1ceaa11e6e673d5" exitCode=0 Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.551142 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86b7dc884f-42l8h" event={"ID":"a688bd96-47e0-4ae4-8e94-3c44f964b9e0","Type":"ContainerDied","Data":"93b745602f033ec1a6fdfa900798b921c5b73da266fffb39f1ceaa11e6e673d5"} Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.887899 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.987824 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-httpd-config\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.987974 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-internal-tls-certs\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.988010 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-ovndb-tls-certs\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.988088 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-combined-ca-bundle\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.988132 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgl6j\" (UniqueName: \"kubernetes.io/projected/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-kube-api-access-xgl6j\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.988229 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-config\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.988247 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-public-tls-certs\") pod \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\" (UID: \"a688bd96-47e0-4ae4-8e94-3c44f964b9e0\") " Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.994037 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-kube-api-access-xgl6j" (OuterVolumeSpecName: "kube-api-access-xgl6j") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "kube-api-access-xgl6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:45 crc kubenswrapper[4921]: I0318 12:35:45.994357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.029657 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.030165 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.038870 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.043721 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-config" (OuterVolumeSpecName: "config") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.052225 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a688bd96-47e0-4ae4-8e94-3c44f964b9e0" (UID: "a688bd96-47e0-4ae4-8e94-3c44f964b9e0"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.090368 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.090640 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgl6j\" (UniqueName: \"kubernetes.io/projected/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-kube-api-access-xgl6j\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.090736 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.090821 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.090900 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.090978 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.091093 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a688bd96-47e0-4ae4-8e94-3c44f964b9e0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.563219 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86b7dc884f-42l8h" event={"ID":"a688bd96-47e0-4ae4-8e94-3c44f964b9e0","Type":"ContainerDied","Data":"9b0b18186784b204a5f8998384f5413e967eea52ad4d64ba6bc5dc709997ff2b"} Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.563307 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86b7dc884f-42l8h" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.563316 4921 scope.go:117] "RemoveContainer" containerID="39e6a8d2a389dad40dfed60f96860330eb4be0119c8e89e962a17c2926a11993" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.598413 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86b7dc884f-42l8h"] Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.606385 4921 scope.go:117] "RemoveContainer" containerID="93b745602f033ec1a6fdfa900798b921c5b73da266fffb39f1ceaa11e6e673d5" Mar 18 12:35:46 crc kubenswrapper[4921]: I0318 12:35:46.607518 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-86b7dc884f-42l8h"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.774204 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.775067 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.775440 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.775517 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.775931 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.777241 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.778425 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:46 crc kubenswrapper[4921]: E0318 12:35:46.778476 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.081140 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.081204 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.081244 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.081860 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:35:47 crc 
kubenswrapper[4921]: I0318 12:35:47.081922 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" gracePeriod=600 Mar 18 12:35:47 crc kubenswrapper[4921]: E0318 12:35:47.204454 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.220365 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" path="/var/lib/kubelet/pods/a688bd96-47e0-4ae4-8e94-3c44f964b9e0/volumes" Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.574939 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" exitCode=0 Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.575001 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd"} Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.575038 4921 scope.go:117] "RemoveContainer" containerID="ecb2d426426fe45a8d3167724569351edcb20678eccb43a43192be5b68165da4" Mar 18 12:35:47 crc kubenswrapper[4921]: I0318 12:35:47.575596 4921 scope.go:117] "RemoveContainer" 
containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:35:47 crc kubenswrapper[4921]: E0318 12:35:47.576151 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.774804 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.776357 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.776661 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] 
Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.776698 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server" Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.777069 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.779262 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.781173 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 12:35:51 crc kubenswrapper[4921]: E0318 12:35:51.781225 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bg8nq" 
podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.617284 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bg8nq_76380191-f4a9-4690-bb6e-cb85ad794e33/ovs-vswitchd/0.log" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.618501 4921 generic.go:334] "Generic (PLEG): container finished" podID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" exitCode=137 Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.618543 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerDied","Data":"8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1"} Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.849809 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bg8nq_76380191-f4a9-4690-bb6e-cb85ad794e33/ovs-vswitchd/0.log" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.851147 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bg8nq" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903069 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-log\") pod \"76380191-f4a9-4690-bb6e-cb85ad794e33\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903134 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-run\") pod \"76380191-f4a9-4690-bb6e-cb85ad794e33\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903172 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-etc-ovs\") pod \"76380191-f4a9-4690-bb6e-cb85ad794e33\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903218 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-log" (OuterVolumeSpecName: "var-log") pod "76380191-f4a9-4690-bb6e-cb85ad794e33" (UID: "76380191-f4a9-4690-bb6e-cb85ad794e33"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903249 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-lib\") pod \"76380191-f4a9-4690-bb6e-cb85ad794e33\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "76380191-f4a9-4690-bb6e-cb85ad794e33" (UID: "76380191-f4a9-4690-bb6e-cb85ad794e33"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76380191-f4a9-4690-bb6e-cb85ad794e33-scripts\") pod \"76380191-f4a9-4690-bb6e-cb85ad794e33\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903435 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v92nk\" (UniqueName: \"kubernetes.io/projected/76380191-f4a9-4690-bb6e-cb85ad794e33-kube-api-access-v92nk\") pod \"76380191-f4a9-4690-bb6e-cb85ad794e33\" (UID: \"76380191-f4a9-4690-bb6e-cb85ad794e33\") " Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903224 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-run" (OuterVolumeSpecName: "var-run") pod "76380191-f4a9-4690-bb6e-cb85ad794e33" (UID: "76380191-f4a9-4690-bb6e-cb85ad794e33"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903295 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-lib" (OuterVolumeSpecName: "var-lib") pod "76380191-f4a9-4690-bb6e-cb85ad794e33" (UID: "76380191-f4a9-4690-bb6e-cb85ad794e33"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.903997 4921 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.904018 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.904028 4921 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.904037 4921 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/76380191-f4a9-4690-bb6e-cb85ad794e33-var-lib\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.904437 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76380191-f4a9-4690-bb6e-cb85ad794e33-scripts" (OuterVolumeSpecName: "scripts") pod "76380191-f4a9-4690-bb6e-cb85ad794e33" (UID: "76380191-f4a9-4690-bb6e-cb85ad794e33"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:52 crc kubenswrapper[4921]: I0318 12:35:52.922989 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76380191-f4a9-4690-bb6e-cb85ad794e33-kube-api-access-v92nk" (OuterVolumeSpecName: "kube-api-access-v92nk") pod "76380191-f4a9-4690-bb6e-cb85ad794e33" (UID: "76380191-f4a9-4690-bb6e-cb85ad794e33"). InnerVolumeSpecName "kube-api-access-v92nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.005253 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v92nk\" (UniqueName: \"kubernetes.io/projected/76380191-f4a9-4690-bb6e-cb85ad794e33-kube-api-access-v92nk\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.005292 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76380191-f4a9-4690-bb6e-cb85ad794e33-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.635892 4921 generic.go:334] "Generic (PLEG): container finished" podID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerID="5417c0aaf863e518433565af42abbad6a0c5b335eef0766c35d94f92e5627f39" exitCode=137 Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.636004 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"5417c0aaf863e518433565af42abbad6a0c5b335eef0766c35d94f92e5627f39"} Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.639196 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bg8nq_76380191-f4a9-4690-bb6e-cb85ad794e33/ovs-vswitchd/0.log" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.641019 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bg8nq" 
event={"ID":"76380191-f4a9-4690-bb6e-cb85ad794e33","Type":"ContainerDied","Data":"b2567843608c14304b7309cb15eb4c236be3e49a9f1421e9d1c36309289091e1"} Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.641060 4921 scope.go:117] "RemoveContainer" containerID="8ed7e73d78bfaf5a473d79796a10418b3806168e85f6d1602c99d39a31bb00c1" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.641242 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bg8nq" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.663735 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-bg8nq"] Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.679430 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-bg8nq"] Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.680378 4921 scope.go:117] "RemoveContainer" containerID="78090a62f42de541b9a8c0bc400794dc59540e6de062609cbeadac150e33b29b" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.707344 4921 scope.go:117] "RemoveContainer" containerID="12a442cf99669b3576078f8841bcba37f822c4c9adab4bd21d43b92d7f71b599" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.818615 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.916525 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2hwm\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-kube-api-access-q2hwm\") pod \"2204df50-7907-4d3b-a8b3-5aee222044f2\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.916626 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204df50-7907-4d3b-a8b3-5aee222044f2-combined-ca-bundle\") pod \"2204df50-7907-4d3b-a8b3-5aee222044f2\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.916664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") pod \"2204df50-7907-4d3b-a8b3-5aee222044f2\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.916685 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-cache\") pod \"2204df50-7907-4d3b-a8b3-5aee222044f2\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.916707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2204df50-7907-4d3b-a8b3-5aee222044f2\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.916739 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-lock\") pod \"2204df50-7907-4d3b-a8b3-5aee222044f2\" (UID: \"2204df50-7907-4d3b-a8b3-5aee222044f2\") " Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.917507 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-cache" (OuterVolumeSpecName: "cache") pod "2204df50-7907-4d3b-a8b3-5aee222044f2" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.917596 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-lock" (OuterVolumeSpecName: "lock") pod "2204df50-7907-4d3b-a8b3-5aee222044f2" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.920436 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2204df50-7907-4d3b-a8b3-5aee222044f2" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.920690 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "2204df50-7907-4d3b-a8b3-5aee222044f2" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:53 crc kubenswrapper[4921]: I0318 12:35:53.921487 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-kube-api-access-q2hwm" (OuterVolumeSpecName: "kube-api-access-q2hwm") pod "2204df50-7907-4d3b-a8b3-5aee222044f2" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2"). InnerVolumeSpecName "kube-api-access-q2hwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.018096 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2hwm\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-kube-api-access-q2hwm\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.018195 4921 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2204df50-7907-4d3b-a8b3-5aee222044f2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.018206 4921 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-cache\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.018233 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.018242 4921 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2204df50-7907-4d3b-a8b3-5aee222044f2-lock\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.036006 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.119434 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.162624 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2204df50-7907-4d3b-a8b3-5aee222044f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2204df50-7907-4d3b-a8b3-5aee222044f2" (UID: "2204df50-7907-4d3b-a8b3-5aee222044f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.220550 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2204df50-7907-4d3b-a8b3-5aee222044f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.656750 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2204df50-7907-4d3b-a8b3-5aee222044f2","Type":"ContainerDied","Data":"f1267c5864f022b9dd760cd059eaba11c12e6c2923d4411e4de36265f39ab3d7"}
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.656829 4921 scope.go:117] "RemoveContainer" containerID="5417c0aaf863e518433565af42abbad6a0c5b335eef0766c35d94f92e5627f39"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.657078 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.698428 4921 scope.go:117] "RemoveContainer" containerID="d92a64abd1a53e46dcafdafcc8d4d1c74904044d5ba50721426f103d435d57d1"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.699239 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.713925 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.720334 4921 scope.go:117] "RemoveContainer" containerID="e1f5e00485fe3c35e3ec69acbb2f60126c30dad072a5acf86f531dc05351e016"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.769874 4921 scope.go:117] "RemoveContainer" containerID="a68d70fd22c882e995ded0c62216a18073ba6612f41913c598c593d06c61a6b2"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.793084 4921 scope.go:117] "RemoveContainer" containerID="0ab56600506809bec91a2b7ae6b9bf4d001cdb5c75b88b21a9af00d3e3d40e90"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.812012 4921 scope.go:117] "RemoveContainer" containerID="fbed1f40b33a5fa1094364d62d483881bd04228924310bd51d1435c0c89e479b"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.833059 4921 scope.go:117] "RemoveContainer" containerID="6fa2fcae87945f7dd516860dd504658f8b9dc554af18972aba630feda6408da7"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.852752 4921 scope.go:117] "RemoveContainer" containerID="6ce294b06257e3ab13597923a81988d1346d5378dc3751bcc0f9a4ac4134d520"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.875454 4921 scope.go:117] "RemoveContainer" containerID="1c245944ddb7cd5c122c6cc477fd4b8c17707a0b034cb749ccf88bd64991b476"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.896626 4921 scope.go:117] "RemoveContainer" containerID="f44537122f931a3e66acf483b594422f9af64976005b3c0018487d261e996304"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.918249 4921 scope.go:117] "RemoveContainer" containerID="1f7ae1ab24fc1064033fd0503c3706bdd8dbdc4d41ba5cab405e7ab75a73598f"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.945252 4921 scope.go:117] "RemoveContainer" containerID="1777c4dc6cc0ebbcf08c1415f64541bce60850c8378a90e1f39c95269a83f819"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.963080 4921 scope.go:117] "RemoveContainer" containerID="89a3992a11b9a42578661ade69e99403032115ef433aaf0df1389b585d36e00b"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.980068 4921 scope.go:117] "RemoveContainer" containerID="72ba457268b54fa0d33c7866b23bca8be1894d0a484abe9be4ab2fd6c11abae3"
Mar 18 12:35:54 crc kubenswrapper[4921]: I0318 12:35:54.996841 4921 scope.go:117] "RemoveContainer" containerID="0b7785aa69c2d4d5a0513e84fe33227f3ad20c98b78d1dcca6b047589db0a914"
Mar 18 12:35:55 crc kubenswrapper[4921]: I0318 12:35:55.219898 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" path="/var/lib/kubelet/pods/2204df50-7907-4d3b-a8b3-5aee222044f2/volumes"
Mar 18 12:35:55 crc kubenswrapper[4921]: I0318 12:35:55.222149 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" path="/var/lib/kubelet/pods/76380191-f4a9-4690-bb6e-cb85ad794e33/volumes"
Mar 18 12:35:59 crc kubenswrapper[4921]: I0318 12:35:59.208989 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd"
Mar 18 12:35:59 crc kubenswrapper[4921]: E0318 12:35:59.209548 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155043 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563956-542rb"]
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155633 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155681 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155703 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerName="mariadb-account-create-update"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155715 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerName="mariadb-account-create-update"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155726 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="sg-core"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155736 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="sg-core"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155757 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="swift-recon-cron"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155768 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="swift-recon-cron"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155785 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fdc858-ca78-4137-b287-a3015e80b660" containerName="nova-scheduler-scheduler"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155795 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fdc858-ca78-4137-b287-a3015e80b660" containerName="nova-scheduler-scheduler"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155823 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="rabbitmq"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155834 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="rabbitmq"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155861 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-updater"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155872 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-updater"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155917 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="setup-container"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155928 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="setup-container"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155942 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155951 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155968 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-updater"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155977 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-updater"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.155986 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.155996 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156016 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156025 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156036 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156045 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156058 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6d9230-4481-43b3-891b-066a3bc6a46f" containerName="keystone-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156068 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6d9230-4481-43b3-891b-066a3bc6a46f" containerName="keystone-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156079 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" containerName="nova-cell0-conductor-conductor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156088 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" containerName="nova-cell0-conductor-conductor"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156101 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156129 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156145 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156154 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156165 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156174 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156183 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156191 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156204 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156213 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156228 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156237 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156254 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156262 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156275 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156285 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156300 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156310 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156328 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="setup-container"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156337 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="setup-container"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156347 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-reaper"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156356 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-reaper"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156372 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156381 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156396 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156404 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156415 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156423 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156437 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server-init"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156445 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server-init"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156458 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156467 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156481 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-metadata"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156489 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-metadata"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156504 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156512 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156528 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1414d026-b9f7-4fb9-ae37-0de669bf759f" containerName="kube-state-metrics"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156536 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1414d026-b9f7-4fb9-ae37-0de669bf759f" containerName="kube-state-metrics"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156550 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="mysql-bootstrap"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156559 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="mysql-bootstrap"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156569 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="openstack-network-exporter"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156577 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="openstack-network-exporter"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156586 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-expirer"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156594 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-expirer"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156612 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156620 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156631 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="proxy-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156639 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="proxy-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156655 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="rsync"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156665 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="rsync"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156675 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156684 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156697 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156706 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156716 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156724 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156735 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156743 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156755 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156763 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156779 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="galera"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156788 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="galera"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156802 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156810 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156820 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-notification-agent"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156829 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-notification-agent"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156840 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156848 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156862 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2f76d2-7d0e-450c-8218-0cf40e03cbee" containerName="memcached"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156871 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2f76d2-7d0e-450c-8218-0cf40e03cbee" containerName="memcached"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156884 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156893 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156906 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="ovn-northd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156915 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="ovn-northd"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156930 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="rabbitmq"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156939 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="rabbitmq"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156951 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-central-agent"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156959 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-central-agent"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156972 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.156981 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.156997 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157006 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: E0318 12:36:00.157022 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a183a61-e314-4bd0-b332-3d216d70c6c2" containerName="nova-cell1-conductor-conductor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157032 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a183a61-e314-4bd0-b332-3d216d70c6c2" containerName="nova-cell1-conductor-conductor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157243 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157282 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157297 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157313 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="openstack-network-exporter"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157326 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157336 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovsdb-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157352 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2f76d2-7d0e-450c-8218-0cf40e03cbee" containerName="memcached"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157364 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157375 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="rsync"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157385 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157400 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157413 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerName="mariadb-account-create-update"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157422 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6d9230-4481-43b3-891b-066a3bc6a46f" containerName="keystone-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157431 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157440 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157450 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3270a214-054c-4c39-aedc-9ba6fb58a7ae" containerName="nova-metadata-metadata"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157459 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="778f8baf-82ce-457d-b32d-35d3abe1a79d" containerName="glance-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157470 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157480 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-updater"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157489 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-central-agent"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157502 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-reaper"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157511 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-updater"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157519 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157529 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1414d026-b9f7-4fb9-ae37-0de669bf759f" containerName="kube-state-metrics"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157539 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b574f65d-9f59-41e3-bec6-59c25cc847fe" containerName="cinder-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157549 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="swift-recon-cron"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157562 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3fdc858-ca78-4137-b287-a3015e80b660" containerName="nova-scheduler-scheduler"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157575 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3271455-7c85-4b68-a27f-fb648ae6abc9" containerName="nova-api-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f8bb3e-7a43-4a4d-8012-143658c681fc" containerName="nova-cell0-conductor-conductor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157600 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef935990-b291-43b7-9d56-673b7b05a7a7" containerName="rabbitmq"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157611 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157625 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6e4980-bd4a-455e-924b-739cee9587c9" containerName="galera"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157636 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef9b9cb-5b95-4fec-b2c7-905ddb5aebe0" containerName="ovn-northd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157648 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ada97c3-d3f9-4fd7-9aeb-5ac7d45b46bf" containerName="glance-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157662 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c908fc5-8a5a-4d0e-8bf3-2b9964319d71" containerName="placement-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157672 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a183a61-e314-4bd0-b332-3d216d70c6c2" containerName="nova-cell1-conductor-conductor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157684 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157693 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-expirer"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157703 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="76380191-f4a9-4690-bb6e-cb85ad794e33" containerName="ovs-vswitchd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157713 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="proxy-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157724 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc7ad6a4-5d48-43b9-a85c-da13a0beed6a" containerName="barbican-worker"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157734 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="sg-core"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157746 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="account-auditor"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157756 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-replicator"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157768 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-httpd"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157778 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c22e952-1a8a-4998-bcc4-72114cb84c82" containerName="ceilometer-notification-agent"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157791 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="08667791-7c42-46d1-a74b-436dfefa5db3" containerName="barbican-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157804 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="df692663-cc58-4cf1-a05b-566e0152ee90" containerName="rabbitmq"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157817 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="object-server"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157832 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener-log"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157843 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="18566d04-485b-411a-a1b8-e761a7fa6933" containerName="barbican-keystone-listener"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157856 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a688bd96-47e0-4ae4-8e94-3c44f964b9e0" containerName="neutron-api"
Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.157871 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerName="mariadb-account-create-update"
Mar 18 12:36:00 crc
kubenswrapper[4921]: I0318 12:36:00.157884 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2204df50-7907-4d3b-a8b3-5aee222044f2" containerName="container-auditor" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.158825 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.161515 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.161823 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.162541 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.164364 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-542rb"] Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.203665 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsf7t\" (UniqueName: \"kubernetes.io/projected/64530c6a-e5d5-467c-a687-7adfb6512cc8-kube-api-access-qsf7t\") pod \"auto-csr-approver-29563956-542rb\" (UID: \"64530c6a-e5d5-467c-a687-7adfb6512cc8\") " pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.305407 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsf7t\" (UniqueName: \"kubernetes.io/projected/64530c6a-e5d5-467c-a687-7adfb6512cc8-kube-api-access-qsf7t\") pod \"auto-csr-approver-29563956-542rb\" (UID: \"64530c6a-e5d5-467c-a687-7adfb6512cc8\") " pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.333030 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsf7t\" (UniqueName: \"kubernetes.io/projected/64530c6a-e5d5-467c-a687-7adfb6512cc8-kube-api-access-qsf7t\") pod \"auto-csr-approver-29563956-542rb\" (UID: \"64530c6a-e5d5-467c-a687-7adfb6512cc8\") " pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.489002 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:00 crc kubenswrapper[4921]: I0318 12:36:00.942705 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-542rb"] Mar 18 12:36:01 crc kubenswrapper[4921]: I0318 12:36:01.728738 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-542rb" event={"ID":"64530c6a-e5d5-467c-a687-7adfb6512cc8","Type":"ContainerStarted","Data":"2f11708b8588bf5822954bf090fbd73677f8f3b90f9269e9db2c40778b35556a"} Mar 18 12:36:02 crc kubenswrapper[4921]: I0318 12:36:02.741195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-542rb" event={"ID":"64530c6a-e5d5-467c-a687-7adfb6512cc8","Type":"ContainerStarted","Data":"c7b9f767f551f95ec6b587b136e37db572f8e86c44fc510b665a1a545c0a7870"} Mar 18 12:36:02 crc kubenswrapper[4921]: I0318 12:36:02.761269 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563956-542rb" podStartSLOduration=1.288707732 podStartE2EDuration="2.761243989s" podCreationTimestamp="2026-03-18 12:36:00 +0000 UTC" firstStartedPulling="2026-03-18 12:36:00.953429845 +0000 UTC m=+1580.503350484" lastFinishedPulling="2026-03-18 12:36:02.425966102 +0000 UTC m=+1581.975886741" observedRunningTime="2026-03-18 12:36:02.75987015 +0000 UTC m=+1582.309790789" watchObservedRunningTime="2026-03-18 12:36:02.761243989 +0000 UTC m=+1582.311164648" 
Mar 18 12:36:03 crc kubenswrapper[4921]: I0318 12:36:03.754592 4921 generic.go:334] "Generic (PLEG): container finished" podID="64530c6a-e5d5-467c-a687-7adfb6512cc8" containerID="c7b9f767f551f95ec6b587b136e37db572f8e86c44fc510b665a1a545c0a7870" exitCode=0 Mar 18 12:36:03 crc kubenswrapper[4921]: I0318 12:36:03.754644 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-542rb" event={"ID":"64530c6a-e5d5-467c-a687-7adfb6512cc8","Type":"ContainerDied","Data":"c7b9f767f551f95ec6b587b136e37db572f8e86c44fc510b665a1a545c0a7870"} Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.054556 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.175006 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsf7t\" (UniqueName: \"kubernetes.io/projected/64530c6a-e5d5-467c-a687-7adfb6512cc8-kube-api-access-qsf7t\") pod \"64530c6a-e5d5-467c-a687-7adfb6512cc8\" (UID: \"64530c6a-e5d5-467c-a687-7adfb6512cc8\") " Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.192622 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64530c6a-e5d5-467c-a687-7adfb6512cc8-kube-api-access-qsf7t" (OuterVolumeSpecName: "kube-api-access-qsf7t") pod "64530c6a-e5d5-467c-a687-7adfb6512cc8" (UID: "64530c6a-e5d5-467c-a687-7adfb6512cc8"). InnerVolumeSpecName "kube-api-access-qsf7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.277906 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsf7t\" (UniqueName: \"kubernetes.io/projected/64530c6a-e5d5-467c-a687-7adfb6512cc8-kube-api-access-qsf7t\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.804131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-542rb" event={"ID":"64530c6a-e5d5-467c-a687-7adfb6512cc8","Type":"ContainerDied","Data":"2f11708b8588bf5822954bf090fbd73677f8f3b90f9269e9db2c40778b35556a"} Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.804169 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f11708b8588bf5822954bf090fbd73677f8f3b90f9269e9db2c40778b35556a" Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.804186 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-542rb" Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.833852 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-trmqv"] Mar 18 12:36:05 crc kubenswrapper[4921]: I0318 12:36:05.839781 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-trmqv"] Mar 18 12:36:07 crc kubenswrapper[4921]: I0318 12:36:07.218856 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0507122c-6da4-423e-ab92-47829824f5de" path="/var/lib/kubelet/pods/0507122c-6da4-423e-ab92-47829824f5de/volumes" Mar 18 12:36:13 crc kubenswrapper[4921]: I0318 12:36:13.209946 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:36:13 crc kubenswrapper[4921]: E0318 12:36:13.211158 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:36:28 crc kubenswrapper[4921]: I0318 12:36:28.209419 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:36:28 crc kubenswrapper[4921]: E0318 12:36:28.210299 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:36:40 crc kubenswrapper[4921]: I0318 12:36:40.209316 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:36:40 crc kubenswrapper[4921]: E0318 12:36:40.210006 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:36:52 crc kubenswrapper[4921]: I0318 12:36:52.209440 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:36:52 crc kubenswrapper[4921]: E0318 12:36:52.210155 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.007218 4921 scope.go:117] "RemoveContainer" containerID="70abdc91b6f2fb58442ccc59e3cfb2b789054925784b519666261caf23ed88b3" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.029694 4921 scope.go:117] "RemoveContainer" containerID="7c94fca6c5293bb11d0c047033353aa8038883ef1a81a2a71b1ff8f8c52e7d06" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.057265 4921 scope.go:117] "RemoveContainer" containerID="f0948baeeda676ec306be8c3a4d0f0f3fb0ef395106571ae3905a145ee1a1119" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.085437 4921 scope.go:117] "RemoveContainer" containerID="9af29f28d1d0bacc1715100e8b5b4d78585264b9810296a3a93d8895a39127c2" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.108642 4921 scope.go:117] "RemoveContainer" containerID="4c273d1e1bd70e964314248da90f6136b916bbbe7e760714e621f8c923c98fa7" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.133917 4921 scope.go:117] "RemoveContainer" containerID="84aeae41e48cf9f3868fdc45a2a0e25ada202b804f77350b8ae8019f82805fc0" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.161080 4921 scope.go:117] "RemoveContainer" containerID="96fa5b7f7c8a41b2a620669f22f68be7614f4871e7bc352d6e8f084f65808780" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.181810 4921 scope.go:117] "RemoveContainer" containerID="89930d04b4c7bed2254497ca2fd22c3ee2be9d23228d09531a0daf68b98760cb" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.225466 4921 scope.go:117] "RemoveContainer" containerID="c0cfcab1c54f029107c9b49c527245287e48346e19a182d6826532d18b88fff2" Mar 
18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.258353 4921 scope.go:117] "RemoveContainer" containerID="fdd42d8f01500d855fa39f90591a9243c08485ff85370402767a972c29d33ddc" Mar 18 12:37:02 crc kubenswrapper[4921]: I0318 12:37:02.291819 4921 scope.go:117] "RemoveContainer" containerID="4eb6151343cc2ec88716d0980c3d56c2d49d23288a278634cdfaa260afdf18bd" Mar 18 12:37:03 crc kubenswrapper[4921]: I0318 12:37:03.208894 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:37:03 crc kubenswrapper[4921]: E0318 12:37:03.209175 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:37:14 crc kubenswrapper[4921]: I0318 12:37:14.210419 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:37:14 crc kubenswrapper[4921]: E0318 12:37:14.211380 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:37:28 crc kubenswrapper[4921]: I0318 12:37:28.210174 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:37:28 crc kubenswrapper[4921]: E0318 12:37:28.210692 4921 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:37:41 crc kubenswrapper[4921]: I0318 12:37:41.213075 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:37:41 crc kubenswrapper[4921]: E0318 12:37:41.213857 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:37:54 crc kubenswrapper[4921]: I0318 12:37:54.209239 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:37:54 crc kubenswrapper[4921]: E0318 12:37:54.209865 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.146304 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563958-6jh7v"] Mar 18 12:38:00 crc kubenswrapper[4921]: E0318 12:38:00.149690 4921 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="64530c6a-e5d5-467c-a687-7adfb6512cc8" containerName="oc" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.149739 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="64530c6a-e5d5-467c-a687-7adfb6512cc8" containerName="oc" Mar 18 12:38:00 crc kubenswrapper[4921]: E0318 12:38:00.149770 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerName="mariadb-account-create-update" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.149778 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb2928a-2c88-4045-bc13-a7dca96f9639" containerName="mariadb-account-create-update" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.150064 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="64530c6a-e5d5-467c-a687-7adfb6512cc8" containerName="oc" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.150632 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.154937 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.155191 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.155306 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.162474 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-6jh7v"] Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.238309 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52m7d\" (UniqueName: 
\"kubernetes.io/projected/bd5c4ee5-777d-4184-8eb6-127237c985c2-kube-api-access-52m7d\") pod \"auto-csr-approver-29563958-6jh7v\" (UID: \"bd5c4ee5-777d-4184-8eb6-127237c985c2\") " pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.340236 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52m7d\" (UniqueName: \"kubernetes.io/projected/bd5c4ee5-777d-4184-8eb6-127237c985c2-kube-api-access-52m7d\") pod \"auto-csr-approver-29563958-6jh7v\" (UID: \"bd5c4ee5-777d-4184-8eb6-127237c985c2\") " pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.362917 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52m7d\" (UniqueName: \"kubernetes.io/projected/bd5c4ee5-777d-4184-8eb6-127237c985c2-kube-api-access-52m7d\") pod \"auto-csr-approver-29563958-6jh7v\" (UID: \"bd5c4ee5-777d-4184-8eb6-127237c985c2\") " pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.473211 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:00 crc kubenswrapper[4921]: I0318 12:38:00.902018 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-6jh7v"] Mar 18 12:38:01 crc kubenswrapper[4921]: I0318 12:38:01.758094 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" event={"ID":"bd5c4ee5-777d-4184-8eb6-127237c985c2","Type":"ContainerStarted","Data":"2c9f5fb98048c42760a5253daba8a3631f356539c0fbb56ad942b84cd0ec2e52"} Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.444164 4921 scope.go:117] "RemoveContainer" containerID="5a591efba1fea098e4e2ee9a5031ae5425e8b92bd58ca9d3acceb4eae64db51a" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.501853 4921 scope.go:117] "RemoveContainer" containerID="c97e73a4d065ed916e858b88a0002d0cab0e04c2e52616c161924f98fb974a12" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.543685 4921 scope.go:117] "RemoveContainer" containerID="f72fcb864229e0724c793833de3329ec5c8607f10e0f065d1a32fcdf40c91c4c" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.573336 4921 scope.go:117] "RemoveContainer" containerID="54415727870d7e61613d52c2f8e855664216756534f5319c4207a9e48b8464c6" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.595474 4921 scope.go:117] "RemoveContainer" containerID="134fe1b336feadcd0f94e532c2e60882555cb33bbe21adb7cf4ca856bed8d851" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.676620 4921 scope.go:117] "RemoveContainer" containerID="bbe5d5b664d93ae46f5d4fb46c965b28251261f3bb035ef27691c9ea1dc9e6e2" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.699564 4921 scope.go:117] "RemoveContainer" containerID="24e1e708b5358a5afd58f060787e1ff245958125b4d36a41a5a823dfeefd9233" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.748344 4921 scope.go:117] "RemoveContainer" 
containerID="3304fa91e8bbabf4082cc5197900d74b2b6e74d5eb12fcdb90a6a309eb2fc462" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.780706 4921 generic.go:334] "Generic (PLEG): container finished" podID="bd5c4ee5-777d-4184-8eb6-127237c985c2" containerID="f0b7ebac41983329621fa63bf65b32f3ea0dc23f34d45595ce71f687a7b580d9" exitCode=0 Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.780761 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" event={"ID":"bd5c4ee5-777d-4184-8eb6-127237c985c2","Type":"ContainerDied","Data":"f0b7ebac41983329621fa63bf65b32f3ea0dc23f34d45595ce71f687a7b580d9"} Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.788156 4921 scope.go:117] "RemoveContainer" containerID="57a6a812f70d6c791d7b79a7c6b59ceeb4376b6b25d705266a3ac24cd61d8dbb" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.810453 4921 scope.go:117] "RemoveContainer" containerID="9676f7e73a17d328518232be535bd6ebc94a2f33d2ffbee78fe37fbd93328b9f" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.830655 4921 scope.go:117] "RemoveContainer" containerID="4e883247d897ebfa3215a4f104abf963ac3aa61acf83b7b4894e7ae109b37711" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.866918 4921 scope.go:117] "RemoveContainer" containerID="7842ac3232e3505695ea73d27b40d5b85b9b18985de261336a40c67658b41cec" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.887424 4921 scope.go:117] "RemoveContainer" containerID="445eccbb89655c23ff7607f6269036df4f7a8f07bbb8c39a52a981e3a42cf9b8" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.905375 4921 scope.go:117] "RemoveContainer" containerID="25f7c772beed4682c89d2283bef0c9fe44e3030f69adccee783f77ab5066eb4b" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.926320 4921 scope.go:117] "RemoveContainer" containerID="b97374f116bf95be9e05efb89dd70d18a2c903b071e0c70977ebb9bf22f1006d" Mar 18 12:38:02 crc kubenswrapper[4921]: I0318 12:38:02.963291 4921 scope.go:117] 
"RemoveContainer" containerID="9ecd0296879ad17c999545066ef81fa6224bf76e4e315c77269829322f92fc94" Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.047211 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.091767 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52m7d\" (UniqueName: \"kubernetes.io/projected/bd5c4ee5-777d-4184-8eb6-127237c985c2-kube-api-access-52m7d\") pod \"bd5c4ee5-777d-4184-8eb6-127237c985c2\" (UID: \"bd5c4ee5-777d-4184-8eb6-127237c985c2\") " Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.096586 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5c4ee5-777d-4184-8eb6-127237c985c2-kube-api-access-52m7d" (OuterVolumeSpecName: "kube-api-access-52m7d") pod "bd5c4ee5-777d-4184-8eb6-127237c985c2" (UID: "bd5c4ee5-777d-4184-8eb6-127237c985c2"). InnerVolumeSpecName "kube-api-access-52m7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.196166 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52m7d\" (UniqueName: \"kubernetes.io/projected/bd5c4ee5-777d-4184-8eb6-127237c985c2-kube-api-access-52m7d\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.803064 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" event={"ID":"bd5c4ee5-777d-4184-8eb6-127237c985c2","Type":"ContainerDied","Data":"2c9f5fb98048c42760a5253daba8a3631f356539c0fbb56ad942b84cd0ec2e52"} Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.803402 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9f5fb98048c42760a5253daba8a3631f356539c0fbb56ad942b84cd0ec2e52" Mar 18 12:38:04 crc kubenswrapper[4921]: I0318 12:38:04.803229 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-6jh7v" Mar 18 12:38:05 crc kubenswrapper[4921]: I0318 12:38:05.116874 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-lfdc6"] Mar 18 12:38:05 crc kubenswrapper[4921]: I0318 12:38:05.121817 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-lfdc6"] Mar 18 12:38:05 crc kubenswrapper[4921]: I0318 12:38:05.208699 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:38:05 crc kubenswrapper[4921]: E0318 12:38:05.208990 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:38:05 crc kubenswrapper[4921]: I0318 12:38:05.217346 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d39e338-b192-4d26-b34e-06f358a643f3" path="/var/lib/kubelet/pods/1d39e338-b192-4d26-b34e-06f358a643f3/volumes" Mar 18 12:38:17 crc kubenswrapper[4921]: I0318 12:38:17.209562 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:38:17 crc kubenswrapper[4921]: E0318 12:38:17.211463 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:38:30 crc kubenswrapper[4921]: I0318 12:38:30.210360 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:38:30 crc kubenswrapper[4921]: E0318 12:38:30.212563 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:38:43 crc kubenswrapper[4921]: I0318 12:38:43.209608 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:38:43 crc kubenswrapper[4921]: E0318 12:38:43.210403 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:38:57 crc kubenswrapper[4921]: I0318 12:38:57.209132 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:38:57 crc kubenswrapper[4921]: E0318 12:38:57.209823 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.323530 4921 scope.go:117] "RemoveContainer" containerID="cac3aa7baca1c481ff2d6c82cc731c0849f2635fd4ed78439f67735301e095b3" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.351769 4921 scope.go:117] "RemoveContainer" containerID="fcd04e6ba101c226ce486909cafb42e88b51add5c7c331c1698a093284551f03" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.379752 4921 scope.go:117] "RemoveContainer" containerID="8ae61aa8c057c79dfa1422f040343df978c73fafb418d65005dcbeffb56a0b62" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.416205 4921 scope.go:117] "RemoveContainer" containerID="f4a7f718a064ff5327ef0699ff8ee174589413ea084ad26555e7c4fd891adbf4" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.437456 4921 scope.go:117] "RemoveContainer" containerID="ac7ed7ac22161ef7e0586585869b63f5c860106566f8be795e90e2efe3141fc5" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 
12:39:03.493830 4921 scope.go:117] "RemoveContainer" containerID="f7216c7a86cd828cfa2701494b171fae9e8224c04fb41a1372f7e7cb90e5cf3a" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.512629 4921 scope.go:117] "RemoveContainer" containerID="512062659d515017ee947ace7cc917182bfc83631f3814efe78fc925e4023714" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.566270 4921 scope.go:117] "RemoveContainer" containerID="f9b427283c3c62291529dbbfa222883688637448a5c073c793a49f2cb67e24a5" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.591217 4921 scope.go:117] "RemoveContainer" containerID="02b675be8b0d3793c287ddec2ea292e0fefd05e85f65de2310afd3b338a8549c" Mar 18 12:39:03 crc kubenswrapper[4921]: I0318 12:39:03.611070 4921 scope.go:117] "RemoveContainer" containerID="8b5c984089b694c583affa983bc59e091bb6cd0500df143a65cfe2b3b9f35af5" Mar 18 12:39:09 crc kubenswrapper[4921]: I0318 12:39:09.209396 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:39:09 crc kubenswrapper[4921]: E0318 12:39:09.210082 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:39:24 crc kubenswrapper[4921]: I0318 12:39:24.209670 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:39:24 crc kubenswrapper[4921]: E0318 12:39:24.210267 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:39:36 crc kubenswrapper[4921]: I0318 12:39:36.209773 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:39:36 crc kubenswrapper[4921]: E0318 12:39:36.210784 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:39:50 crc kubenswrapper[4921]: I0318 12:39:50.209251 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:39:50 crc kubenswrapper[4921]: E0318 12:39:50.210218 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.158573 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563960-snlp4"] Mar 18 12:40:00 crc kubenswrapper[4921]: E0318 12:40:00.159522 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5c4ee5-777d-4184-8eb6-127237c985c2" containerName="oc" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.159538 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5c4ee5-777d-4184-8eb6-127237c985c2" containerName="oc" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.159696 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5c4ee5-777d-4184-8eb6-127237c985c2" containerName="oc" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.160221 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.163617 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.164240 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.164796 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.179249 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-snlp4"] Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.308418 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbt6v\" (UniqueName: \"kubernetes.io/projected/484733a9-30b5-4c90-b297-5cf18424e87f-kube-api-access-fbt6v\") pod \"auto-csr-approver-29563960-snlp4\" (UID: \"484733a9-30b5-4c90-b297-5cf18424e87f\") " pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.409863 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbt6v\" (UniqueName: \"kubernetes.io/projected/484733a9-30b5-4c90-b297-5cf18424e87f-kube-api-access-fbt6v\") pod \"auto-csr-approver-29563960-snlp4\" (UID: \"484733a9-30b5-4c90-b297-5cf18424e87f\") " 
pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.431616 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbt6v\" (UniqueName: \"kubernetes.io/projected/484733a9-30b5-4c90-b297-5cf18424e87f-kube-api-access-fbt6v\") pod \"auto-csr-approver-29563960-snlp4\" (UID: \"484733a9-30b5-4c90-b297-5cf18424e87f\") " pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.476870 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.699430 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-snlp4"] Mar 18 12:40:00 crc kubenswrapper[4921]: I0318 12:40:00.701021 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:40:01 crc kubenswrapper[4921]: I0318 12:40:01.635534 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-snlp4" event={"ID":"484733a9-30b5-4c90-b297-5cf18424e87f","Type":"ContainerStarted","Data":"df42d9fcc07c21d70114c2c08757c86042e863afb134d710513ffcfa451d47e5"} Mar 18 12:40:02 crc kubenswrapper[4921]: I0318 12:40:02.646199 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-snlp4" event={"ID":"484733a9-30b5-4c90-b297-5cf18424e87f","Type":"ContainerStarted","Data":"a11975923dd70f014b0bf24cb7ef0e4958d37fce2428a2372d7281c16e0be8b3"} Mar 18 12:40:02 crc kubenswrapper[4921]: I0318 12:40:02.665522 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563960-snlp4" podStartSLOduration=1.193027587 podStartE2EDuration="2.665503315s" podCreationTimestamp="2026-03-18 12:40:00 +0000 UTC" firstStartedPulling="2026-03-18 
12:40:00.700470948 +0000 UTC m=+1820.250391587" lastFinishedPulling="2026-03-18 12:40:02.172946676 +0000 UTC m=+1821.722867315" observedRunningTime="2026-03-18 12:40:02.66288541 +0000 UTC m=+1822.212806059" watchObservedRunningTime="2026-03-18 12:40:02.665503315 +0000 UTC m=+1822.215423954" Mar 18 12:40:03 crc kubenswrapper[4921]: I0318 12:40:03.655654 4921 generic.go:334] "Generic (PLEG): container finished" podID="484733a9-30b5-4c90-b297-5cf18424e87f" containerID="a11975923dd70f014b0bf24cb7ef0e4958d37fce2428a2372d7281c16e0be8b3" exitCode=0 Mar 18 12:40:03 crc kubenswrapper[4921]: I0318 12:40:03.655761 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-snlp4" event={"ID":"484733a9-30b5-4c90-b297-5cf18424e87f","Type":"ContainerDied","Data":"a11975923dd70f014b0bf24cb7ef0e4958d37fce2428a2372d7281c16e0be8b3"} Mar 18 12:40:03 crc kubenswrapper[4921]: I0318 12:40:03.767900 4921 scope.go:117] "RemoveContainer" containerID="1162424b5b68d6c7d3d3885c3c665ba5e9a0ab486bfe59bb8e8945864846a899" Mar 18 12:40:03 crc kubenswrapper[4921]: I0318 12:40:03.814383 4921 scope.go:117] "RemoveContainer" containerID="e548a8515ac525a12f304eb89af42a6fa6a6e913f8703bde588e092196163ce1" Mar 18 12:40:04 crc kubenswrapper[4921]: I0318 12:40:04.489200 4921 scope.go:117] "RemoveContainer" containerID="9632959f08bc3cd646cbe6c91fa73851a145e8b01415f795c58c8ee45b1d10bb" Mar 18 12:40:04 crc kubenswrapper[4921]: I0318 12:40:04.920807 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.080019 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbt6v\" (UniqueName: \"kubernetes.io/projected/484733a9-30b5-4c90-b297-5cf18424e87f-kube-api-access-fbt6v\") pod \"484733a9-30b5-4c90-b297-5cf18424e87f\" (UID: \"484733a9-30b5-4c90-b297-5cf18424e87f\") " Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.084869 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484733a9-30b5-4c90-b297-5cf18424e87f-kube-api-access-fbt6v" (OuterVolumeSpecName: "kube-api-access-fbt6v") pod "484733a9-30b5-4c90-b297-5cf18424e87f" (UID: "484733a9-30b5-4c90-b297-5cf18424e87f"). InnerVolumeSpecName "kube-api-access-fbt6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.182187 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbt6v\" (UniqueName: \"kubernetes.io/projected/484733a9-30b5-4c90-b297-5cf18424e87f-kube-api-access-fbt6v\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.209330 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:40:05 crc kubenswrapper[4921]: E0318 12:40:05.209613 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.683657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563960-snlp4" event={"ID":"484733a9-30b5-4c90-b297-5cf18424e87f","Type":"ContainerDied","Data":"df42d9fcc07c21d70114c2c08757c86042e863afb134d710513ffcfa451d47e5"} Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.683702 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df42d9fcc07c21d70114c2c08757c86042e863afb134d710513ffcfa451d47e5" Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.683741 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-snlp4" Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.735515 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-2bkbb"] Mar 18 12:40:05 crc kubenswrapper[4921]: I0318 12:40:05.740795 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-2bkbb"] Mar 18 12:40:07 crc kubenswrapper[4921]: I0318 12:40:07.218430 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c18cacd-e41a-4e03-ac32-0633e90d60c1" path="/var/lib/kubelet/pods/6c18cacd-e41a-4e03-ac32-0633e90d60c1/volumes" Mar 18 12:40:18 crc kubenswrapper[4921]: I0318 12:40:18.209552 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:40:18 crc kubenswrapper[4921]: E0318 12:40:18.210773 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:40:32 crc kubenswrapper[4921]: I0318 12:40:32.208610 4921 scope.go:117] "RemoveContainer" 
containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:40:32 crc kubenswrapper[4921]: E0318 12:40:32.209382 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:40:47 crc kubenswrapper[4921]: I0318 12:40:47.532593 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:40:48 crc kubenswrapper[4921]: I0318 12:40:48.555331 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"ab9dd9e85306f746850577c44545c0740315aa322f67909555228d828bc165e9"} Mar 18 12:41:04 crc kubenswrapper[4921]: I0318 12:41:04.584904 4921 scope.go:117] "RemoveContainer" containerID="0bcc716a8038376b1b7b1bc7e7ffe225d7b8b38b155200403e773bbe8b8c3326" Mar 18 12:41:04 crc kubenswrapper[4921]: I0318 12:41:04.636522 4921 scope.go:117] "RemoveContainer" containerID="883b6fdd80905e59b68bd5edf1ef12d420de787a692ba232378312e1af55dfbd" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.716650 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7xrv8"] Mar 18 12:41:35 crc kubenswrapper[4921]: E0318 12:41:35.718042 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484733a9-30b5-4c90-b297-5cf18424e87f" containerName="oc" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.718065 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="484733a9-30b5-4c90-b297-5cf18424e87f" containerName="oc" Mar 18 
12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.718437 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="484733a9-30b5-4c90-b297-5cf18424e87f" containerName="oc" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.720015 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.735711 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xrv8"] Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.806775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-catalog-content\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.806839 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bq4\" (UniqueName: \"kubernetes.io/projected/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-kube-api-access-48bq4\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.806870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-utilities\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.908201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-catalog-content\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.908287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bq4\" (UniqueName: \"kubernetes.io/projected/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-kube-api-access-48bq4\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.908328 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-utilities\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.908798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-catalog-content\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.908835 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-utilities\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:35 crc kubenswrapper[4921]: I0318 12:41:35.929177 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bq4\" (UniqueName: 
\"kubernetes.io/projected/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-kube-api-access-48bq4\") pod \"redhat-operators-7xrv8\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:36 crc kubenswrapper[4921]: I0318 12:41:36.047029 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:36 crc kubenswrapper[4921]: I0318 12:41:36.473074 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7xrv8"] Mar 18 12:41:36 crc kubenswrapper[4921]: I0318 12:41:36.934184 4921 generic.go:334] "Generic (PLEG): container finished" podID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerID="b64e6fe451a0022343b9bcd38fef276441e1ff05637146dc58c2921955e2dff8" exitCode=0 Mar 18 12:41:36 crc kubenswrapper[4921]: I0318 12:41:36.934309 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerDied","Data":"b64e6fe451a0022343b9bcd38fef276441e1ff05637146dc58c2921955e2dff8"} Mar 18 12:41:36 crc kubenswrapper[4921]: I0318 12:41:36.934731 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerStarted","Data":"cb9778e13a33fbca7b13dbdecbcb0ea9d6c32e9fc00e1c85bdd251af6dbb984a"} Mar 18 12:41:37 crc kubenswrapper[4921]: I0318 12:41:37.943129 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerStarted","Data":"f445092268979fb8bade7a0091390c5087300c342620e7187f2b2b10380f8dfc"} Mar 18 12:41:38 crc kubenswrapper[4921]: I0318 12:41:38.952311 4921 generic.go:334] "Generic (PLEG): container finished" podID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" 
containerID="f445092268979fb8bade7a0091390c5087300c342620e7187f2b2b10380f8dfc" exitCode=0 Mar 18 12:41:38 crc kubenswrapper[4921]: I0318 12:41:38.952382 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerDied","Data":"f445092268979fb8bade7a0091390c5087300c342620e7187f2b2b10380f8dfc"} Mar 18 12:41:39 crc kubenswrapper[4921]: I0318 12:41:39.960638 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerStarted","Data":"d804aec0d312f1d245aa11d6e97072396fb4e3c870e8ed4ffb0144c1dc5723c8"} Mar 18 12:41:39 crc kubenswrapper[4921]: I0318 12:41:39.982630 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7xrv8" podStartSLOduration=2.546388592 podStartE2EDuration="4.982606135s" podCreationTimestamp="2026-03-18 12:41:35 +0000 UTC" firstStartedPulling="2026-03-18 12:41:36.936021651 +0000 UTC m=+1916.485942290" lastFinishedPulling="2026-03-18 12:41:39.372239194 +0000 UTC m=+1918.922159833" observedRunningTime="2026-03-18 12:41:39.977205032 +0000 UTC m=+1919.527125671" watchObservedRunningTime="2026-03-18 12:41:39.982606135 +0000 UTC m=+1919.532526764" Mar 18 12:41:46 crc kubenswrapper[4921]: I0318 12:41:46.047993 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:46 crc kubenswrapper[4921]: I0318 12:41:46.048640 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:46 crc kubenswrapper[4921]: I0318 12:41:46.100918 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:47 crc kubenswrapper[4921]: I0318 12:41:47.063973 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:47 crc kubenswrapper[4921]: I0318 12:41:47.114623 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7xrv8"] Mar 18 12:41:49 crc kubenswrapper[4921]: I0318 12:41:49.041933 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7xrv8" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="registry-server" containerID="cri-o://d804aec0d312f1d245aa11d6e97072396fb4e3c870e8ed4ffb0144c1dc5723c8" gracePeriod=2 Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.059922 4921 generic.go:334] "Generic (PLEG): container finished" podID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerID="d804aec0d312f1d245aa11d6e97072396fb4e3c870e8ed4ffb0144c1dc5723c8" exitCode=0 Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.060190 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerDied","Data":"d804aec0d312f1d245aa11d6e97072396fb4e3c870e8ed4ffb0144c1dc5723c8"} Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.377045 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.468811 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-catalog-content\") pod \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.469058 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-utilities\") pod \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.469203 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48bq4\" (UniqueName: \"kubernetes.io/projected/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-kube-api-access-48bq4\") pod \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\" (UID: \"6f9e719c-20d6-4880-aa9d-8d8ee03596cb\") " Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.470047 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-utilities" (OuterVolumeSpecName: "utilities") pod "6f9e719c-20d6-4880-aa9d-8d8ee03596cb" (UID: "6f9e719c-20d6-4880-aa9d-8d8ee03596cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.480587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-kube-api-access-48bq4" (OuterVolumeSpecName: "kube-api-access-48bq4") pod "6f9e719c-20d6-4880-aa9d-8d8ee03596cb" (UID: "6f9e719c-20d6-4880-aa9d-8d8ee03596cb"). InnerVolumeSpecName "kube-api-access-48bq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.571622 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.571671 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48bq4\" (UniqueName: \"kubernetes.io/projected/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-kube-api-access-48bq4\") on node \"crc\" DevicePath \"\"" Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.635232 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f9e719c-20d6-4880-aa9d-8d8ee03596cb" (UID: "6f9e719c-20d6-4880-aa9d-8d8ee03596cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:41:51 crc kubenswrapper[4921]: I0318 12:41:51.673503 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9e719c-20d6-4880-aa9d-8d8ee03596cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.069318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7xrv8" event={"ID":"6f9e719c-20d6-4880-aa9d-8d8ee03596cb","Type":"ContainerDied","Data":"cb9778e13a33fbca7b13dbdecbcb0ea9d6c32e9fc00e1c85bdd251af6dbb984a"} Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.069430 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7xrv8" Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.069688 4921 scope.go:117] "RemoveContainer" containerID="d804aec0d312f1d245aa11d6e97072396fb4e3c870e8ed4ffb0144c1dc5723c8" Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.086965 4921 scope.go:117] "RemoveContainer" containerID="f445092268979fb8bade7a0091390c5087300c342620e7187f2b2b10380f8dfc" Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.106272 4921 scope.go:117] "RemoveContainer" containerID="b64e6fe451a0022343b9bcd38fef276441e1ff05637146dc58c2921955e2dff8" Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.130328 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7xrv8"] Mar 18 12:41:52 crc kubenswrapper[4921]: I0318 12:41:52.139086 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7xrv8"] Mar 18 12:41:53 crc kubenswrapper[4921]: I0318 12:41:53.220097 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" path="/var/lib/kubelet/pods/6f9e719c-20d6-4880-aa9d-8d8ee03596cb/volumes" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.154125 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563962-rvxsh"] Mar 18 12:42:00 crc kubenswrapper[4921]: E0318 12:42:00.155131 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="extract-content" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.155151 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="extract-content" Mar 18 12:42:00 crc kubenswrapper[4921]: E0318 12:42:00.155165 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="registry-server" Mar 18 12:42:00 crc 
kubenswrapper[4921]: I0318 12:42:00.155173 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="registry-server" Mar 18 12:42:00 crc kubenswrapper[4921]: E0318 12:42:00.155206 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="extract-utilities" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.155218 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="extract-utilities" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.155394 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9e719c-20d6-4880-aa9d-8d8ee03596cb" containerName="registry-server" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.155910 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.159087 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.159249 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.159268 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.163702 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-rvxsh"] Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.305589 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9bzk\" (UniqueName: \"kubernetes.io/projected/1ef7c256-14a8-49b3-b3a0-d6817580141c-kube-api-access-j9bzk\") pod 
\"auto-csr-approver-29563962-rvxsh\" (UID: \"1ef7c256-14a8-49b3-b3a0-d6817580141c\") " pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.407213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9bzk\" (UniqueName: \"kubernetes.io/projected/1ef7c256-14a8-49b3-b3a0-d6817580141c-kube-api-access-j9bzk\") pod \"auto-csr-approver-29563962-rvxsh\" (UID: \"1ef7c256-14a8-49b3-b3a0-d6817580141c\") " pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.432970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9bzk\" (UniqueName: \"kubernetes.io/projected/1ef7c256-14a8-49b3-b3a0-d6817580141c-kube-api-access-j9bzk\") pod \"auto-csr-approver-29563962-rvxsh\" (UID: \"1ef7c256-14a8-49b3-b3a0-d6817580141c\") " pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.473849 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:00 crc kubenswrapper[4921]: I0318 12:42:00.909591 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-rvxsh"] Mar 18 12:42:01 crc kubenswrapper[4921]: I0318 12:42:01.146924 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" event={"ID":"1ef7c256-14a8-49b3-b3a0-d6817580141c","Type":"ContainerStarted","Data":"7937fb5a0912ff6d4d93c786df465703c8bd8b12691eabddcb1c94403eb08d1a"} Mar 18 12:42:03 crc kubenswrapper[4921]: I0318 12:42:03.162580 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ef7c256-14a8-49b3-b3a0-d6817580141c" containerID="8cbd3f99f290ed62aef30fa0945393e2eaac2000b989ab478144a662606e991b" exitCode=0 Mar 18 12:42:03 crc kubenswrapper[4921]: I0318 12:42:03.162647 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" event={"ID":"1ef7c256-14a8-49b3-b3a0-d6817580141c","Type":"ContainerDied","Data":"8cbd3f99f290ed62aef30fa0945393e2eaac2000b989ab478144a662606e991b"} Mar 18 12:42:04 crc kubenswrapper[4921]: I0318 12:42:04.444393 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:04 crc kubenswrapper[4921]: I0318 12:42:04.579083 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9bzk\" (UniqueName: \"kubernetes.io/projected/1ef7c256-14a8-49b3-b3a0-d6817580141c-kube-api-access-j9bzk\") pod \"1ef7c256-14a8-49b3-b3a0-d6817580141c\" (UID: \"1ef7c256-14a8-49b3-b3a0-d6817580141c\") " Mar 18 12:42:04 crc kubenswrapper[4921]: I0318 12:42:04.584438 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef7c256-14a8-49b3-b3a0-d6817580141c-kube-api-access-j9bzk" (OuterVolumeSpecName: "kube-api-access-j9bzk") pod "1ef7c256-14a8-49b3-b3a0-d6817580141c" (UID: "1ef7c256-14a8-49b3-b3a0-d6817580141c"). InnerVolumeSpecName "kube-api-access-j9bzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:42:04 crc kubenswrapper[4921]: I0318 12:42:04.680728 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9bzk\" (UniqueName: \"kubernetes.io/projected/1ef7c256-14a8-49b3-b3a0-d6817580141c-kube-api-access-j9bzk\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:05 crc kubenswrapper[4921]: I0318 12:42:05.202689 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" event={"ID":"1ef7c256-14a8-49b3-b3a0-d6817580141c","Type":"ContainerDied","Data":"7937fb5a0912ff6d4d93c786df465703c8bd8b12691eabddcb1c94403eb08d1a"} Mar 18 12:42:05 crc kubenswrapper[4921]: I0318 12:42:05.202731 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7937fb5a0912ff6d4d93c786df465703c8bd8b12691eabddcb1c94403eb08d1a" Mar 18 12:42:05 crc kubenswrapper[4921]: I0318 12:42:05.202764 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-rvxsh" Mar 18 12:42:05 crc kubenswrapper[4921]: I0318 12:42:05.517619 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-542rb"] Mar 18 12:42:05 crc kubenswrapper[4921]: I0318 12:42:05.528071 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-542rb"] Mar 18 12:42:07 crc kubenswrapper[4921]: I0318 12:42:07.219754 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64530c6a-e5d5-467c-a687-7adfb6512cc8" path="/var/lib/kubelet/pods/64530c6a-e5d5-467c-a687-7adfb6512cc8/volumes" Mar 18 12:42:47 crc kubenswrapper[4921]: I0318 12:42:47.081288 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:42:47 crc kubenswrapper[4921]: I0318 12:42:47.081833 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:43:04 crc kubenswrapper[4921]: I0318 12:43:04.741760 4921 scope.go:117] "RemoveContainer" containerID="c7b9f767f551f95ec6b587b136e37db572f8e86c44fc510b665a1a545c0a7870" Mar 18 12:43:17 crc kubenswrapper[4921]: I0318 12:43:17.080988 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:43:17 crc kubenswrapper[4921]: 
I0318 12:43:17.082319 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.080835 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.081445 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.081488 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.082063 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab9dd9e85306f746850577c44545c0740315aa322f67909555228d828bc165e9"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.082140 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" 
containerName="machine-config-daemon" containerID="cri-o://ab9dd9e85306f746850577c44545c0740315aa322f67909555228d828bc165e9" gracePeriod=600 Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.909670 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="ab9dd9e85306f746850577c44545c0740315aa322f67909555228d828bc165e9" exitCode=0 Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.909743 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"ab9dd9e85306f746850577c44545c0740315aa322f67909555228d828bc165e9"} Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.910064 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5"} Mar 18 12:43:47 crc kubenswrapper[4921]: I0318 12:43:47.910093 4921 scope.go:117] "RemoveContainer" containerID="8bf4a93aabfa255f323ed975c62b76feb22fd0a08317a9e1b78ab34c462aa9fd" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.153085 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563964-wndmf"] Mar 18 12:44:00 crc kubenswrapper[4921]: E0318 12:44:00.154278 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef7c256-14a8-49b3-b3a0-d6817580141c" containerName="oc" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.154299 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef7c256-14a8-49b3-b3a0-d6817580141c" containerName="oc" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.154482 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef7c256-14a8-49b3-b3a0-d6817580141c" containerName="oc" Mar 18 12:44:00 
crc kubenswrapper[4921]: I0318 12:44:00.155068 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.157647 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.157971 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.158536 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.171641 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-wndmf"] Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.224722 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mfp\" (UniqueName: \"kubernetes.io/projected/9e194df3-e74a-4fd0-860b-39649533a25f-kube-api-access-p4mfp\") pod \"auto-csr-approver-29563964-wndmf\" (UID: \"9e194df3-e74a-4fd0-860b-39649533a25f\") " pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.326127 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mfp\" (UniqueName: \"kubernetes.io/projected/9e194df3-e74a-4fd0-860b-39649533a25f-kube-api-access-p4mfp\") pod \"auto-csr-approver-29563964-wndmf\" (UID: \"9e194df3-e74a-4fd0-860b-39649533a25f\") " pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.345198 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mfp\" (UniqueName: \"kubernetes.io/projected/9e194df3-e74a-4fd0-860b-39649533a25f-kube-api-access-p4mfp\") 
pod \"auto-csr-approver-29563964-wndmf\" (UID: \"9e194df3-e74a-4fd0-860b-39649533a25f\") " pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.483066 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:00 crc kubenswrapper[4921]: I0318 12:44:00.721291 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-wndmf"] Mar 18 12:44:01 crc kubenswrapper[4921]: I0318 12:44:01.019036 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-wndmf" event={"ID":"9e194df3-e74a-4fd0-860b-39649533a25f","Type":"ContainerStarted","Data":"052b174160bed2c1a383928b5a413fdfc8f3846ce42d01043a66c537ea0c6b82"} Mar 18 12:44:02 crc kubenswrapper[4921]: I0318 12:44:02.027443 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-wndmf" event={"ID":"9e194df3-e74a-4fd0-860b-39649533a25f","Type":"ContainerStarted","Data":"83785af6effad1302d3e450280bf778f94dfc98dbd998c687294191289cd630b"} Mar 18 12:44:02 crc kubenswrapper[4921]: I0318 12:44:02.041612 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563964-wndmf" podStartSLOduration=1.121465171 podStartE2EDuration="2.041590451s" podCreationTimestamp="2026-03-18 12:44:00 +0000 UTC" firstStartedPulling="2026-03-18 12:44:00.729476378 +0000 UTC m=+2060.279397017" lastFinishedPulling="2026-03-18 12:44:01.649601658 +0000 UTC m=+2061.199522297" observedRunningTime="2026-03-18 12:44:02.040009757 +0000 UTC m=+2061.589930406" watchObservedRunningTime="2026-03-18 12:44:02.041590451 +0000 UTC m=+2061.591511090" Mar 18 12:44:03 crc kubenswrapper[4921]: I0318 12:44:03.035685 4921 generic.go:334] "Generic (PLEG): container finished" podID="9e194df3-e74a-4fd0-860b-39649533a25f" 
containerID="83785af6effad1302d3e450280bf778f94dfc98dbd998c687294191289cd630b" exitCode=0 Mar 18 12:44:03 crc kubenswrapper[4921]: I0318 12:44:03.035742 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-wndmf" event={"ID":"9e194df3-e74a-4fd0-860b-39649533a25f","Type":"ContainerDied","Data":"83785af6effad1302d3e450280bf778f94dfc98dbd998c687294191289cd630b"} Mar 18 12:44:04 crc kubenswrapper[4921]: I0318 12:44:04.566161 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:04 crc kubenswrapper[4921]: I0318 12:44:04.738298 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4mfp\" (UniqueName: \"kubernetes.io/projected/9e194df3-e74a-4fd0-860b-39649533a25f-kube-api-access-p4mfp\") pod \"9e194df3-e74a-4fd0-860b-39649533a25f\" (UID: \"9e194df3-e74a-4fd0-860b-39649533a25f\") " Mar 18 12:44:04 crc kubenswrapper[4921]: I0318 12:44:04.742656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e194df3-e74a-4fd0-860b-39649533a25f-kube-api-access-p4mfp" (OuterVolumeSpecName: "kube-api-access-p4mfp") pod "9e194df3-e74a-4fd0-860b-39649533a25f" (UID: "9e194df3-e74a-4fd0-860b-39649533a25f"). InnerVolumeSpecName "kube-api-access-p4mfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:44:04 crc kubenswrapper[4921]: I0318 12:44:04.840181 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4mfp\" (UniqueName: \"kubernetes.io/projected/9e194df3-e74a-4fd0-860b-39649533a25f-kube-api-access-p4mfp\") on node \"crc\" DevicePath \"\"" Mar 18 12:44:05 crc kubenswrapper[4921]: I0318 12:44:05.051727 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-wndmf" event={"ID":"9e194df3-e74a-4fd0-860b-39649533a25f","Type":"ContainerDied","Data":"052b174160bed2c1a383928b5a413fdfc8f3846ce42d01043a66c537ea0c6b82"} Mar 18 12:44:05 crc kubenswrapper[4921]: I0318 12:44:05.051767 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="052b174160bed2c1a383928b5a413fdfc8f3846ce42d01043a66c537ea0c6b82" Mar 18 12:44:05 crc kubenswrapper[4921]: I0318 12:44:05.052529 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-wndmf" Mar 18 12:44:05 crc kubenswrapper[4921]: I0318 12:44:05.637623 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-6jh7v"] Mar 18 12:44:05 crc kubenswrapper[4921]: I0318 12:44:05.643744 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-6jh7v"] Mar 18 12:44:07 crc kubenswrapper[4921]: I0318 12:44:07.217774 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5c4ee5-777d-4184-8eb6-127237c985c2" path="/var/lib/kubelet/pods/bd5c4ee5-777d-4184-8eb6-127237c985c2/volumes" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.154471 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5"] Mar 18 12:45:00 crc kubenswrapper[4921]: E0318 12:45:00.155339 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9e194df3-e74a-4fd0-860b-39649533a25f" containerName="oc" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.155358 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e194df3-e74a-4fd0-860b-39649533a25f" containerName="oc" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.155523 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e194df3-e74a-4fd0-860b-39649533a25f" containerName="oc" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.156031 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.158392 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.159014 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.171611 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5"] Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.300415 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pv6t\" (UniqueName: \"kubernetes.io/projected/148f26a3-3a38-4c59-b627-ed51585387fe-kube-api-access-8pv6t\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.300814 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/148f26a3-3a38-4c59-b627-ed51585387fe-config-volume\") pod 
\"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.300916 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/148f26a3-3a38-4c59-b627-ed51585387fe-secret-volume\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.402569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/148f26a3-3a38-4c59-b627-ed51585387fe-secret-volume\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.402703 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pv6t\" (UniqueName: \"kubernetes.io/projected/148f26a3-3a38-4c59-b627-ed51585387fe-kube-api-access-8pv6t\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.402763 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/148f26a3-3a38-4c59-b627-ed51585387fe-config-volume\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.404533 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/148f26a3-3a38-4c59-b627-ed51585387fe-config-volume\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.410702 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/148f26a3-3a38-4c59-b627-ed51585387fe-secret-volume\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.418837 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pv6t\" (UniqueName: \"kubernetes.io/projected/148f26a3-3a38-4c59-b627-ed51585387fe-kube-api-access-8pv6t\") pod \"collect-profiles-29563965-2ntx5\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.475906 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:00 crc kubenswrapper[4921]: I0318 12:45:00.878572 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5"] Mar 18 12:45:01 crc kubenswrapper[4921]: I0318 12:45:01.422681 4921 generic.go:334] "Generic (PLEG): container finished" podID="148f26a3-3a38-4c59-b627-ed51585387fe" containerID="bf94485548949f36d9248c8f52d899bdc9cb04baa9a98f85413de2c3d0446a8f" exitCode=0 Mar 18 12:45:01 crc kubenswrapper[4921]: I0318 12:45:01.422748 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" event={"ID":"148f26a3-3a38-4c59-b627-ed51585387fe","Type":"ContainerDied","Data":"bf94485548949f36d9248c8f52d899bdc9cb04baa9a98f85413de2c3d0446a8f"} Mar 18 12:45:01 crc kubenswrapper[4921]: I0318 12:45:01.422798 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" event={"ID":"148f26a3-3a38-4c59-b627-ed51585387fe","Type":"ContainerStarted","Data":"dba0ae59f9a21f1847bccf6612b84e10594df61bd3084024f4fa05b37cafaf86"} Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.693628 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.837082 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/148f26a3-3a38-4c59-b627-ed51585387fe-secret-volume\") pod \"148f26a3-3a38-4c59-b627-ed51585387fe\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.837263 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/148f26a3-3a38-4c59-b627-ed51585387fe-config-volume\") pod \"148f26a3-3a38-4c59-b627-ed51585387fe\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.837346 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pv6t\" (UniqueName: \"kubernetes.io/projected/148f26a3-3a38-4c59-b627-ed51585387fe-kube-api-access-8pv6t\") pod \"148f26a3-3a38-4c59-b627-ed51585387fe\" (UID: \"148f26a3-3a38-4c59-b627-ed51585387fe\") " Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.838424 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148f26a3-3a38-4c59-b627-ed51585387fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "148f26a3-3a38-4c59-b627-ed51585387fe" (UID: "148f26a3-3a38-4c59-b627-ed51585387fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.843656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148f26a3-3a38-4c59-b627-ed51585387fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "148f26a3-3a38-4c59-b627-ed51585387fe" (UID: "148f26a3-3a38-4c59-b627-ed51585387fe"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.843664 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148f26a3-3a38-4c59-b627-ed51585387fe-kube-api-access-8pv6t" (OuterVolumeSpecName: "kube-api-access-8pv6t") pod "148f26a3-3a38-4c59-b627-ed51585387fe" (UID: "148f26a3-3a38-4c59-b627-ed51585387fe"). InnerVolumeSpecName "kube-api-access-8pv6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.938795 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/148f26a3-3a38-4c59-b627-ed51585387fe-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.938851 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pv6t\" (UniqueName: \"kubernetes.io/projected/148f26a3-3a38-4c59-b627-ed51585387fe-kube-api-access-8pv6t\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:02 crc kubenswrapper[4921]: I0318 12:45:02.938865 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/148f26a3-3a38-4c59-b627-ed51585387fe-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:03 crc kubenswrapper[4921]: I0318 12:45:03.440651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" event={"ID":"148f26a3-3a38-4c59-b627-ed51585387fe","Type":"ContainerDied","Data":"dba0ae59f9a21f1847bccf6612b84e10594df61bd3084024f4fa05b37cafaf86"} Mar 18 12:45:03 crc kubenswrapper[4921]: I0318 12:45:03.441138 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dba0ae59f9a21f1847bccf6612b84e10594df61bd3084024f4fa05b37cafaf86" Mar 18 12:45:03 crc kubenswrapper[4921]: I0318 12:45:03.440884 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5" Mar 18 12:45:03 crc kubenswrapper[4921]: I0318 12:45:03.769321 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"] Mar 18 12:45:03 crc kubenswrapper[4921]: I0318 12:45:03.776460 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-2bn6n"] Mar 18 12:45:04 crc kubenswrapper[4921]: I0318 12:45:04.813545 4921 scope.go:117] "RemoveContainer" containerID="f0b7ebac41983329621fa63bf65b32f3ea0dc23f34d45595ce71f687a7b580d9" Mar 18 12:45:05 crc kubenswrapper[4921]: I0318 12:45:05.229988 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb744db6-3732-400d-8939-2577d28e7cd5" path="/var/lib/kubelet/pods/fb744db6-3732-400d-8939-2577d28e7cd5/volumes" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.937908 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xw56p"] Mar 18 12:45:26 crc kubenswrapper[4921]: E0318 12:45:26.939562 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148f26a3-3a38-4c59-b627-ed51585387fe" containerName="collect-profiles" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.939582 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="148f26a3-3a38-4c59-b627-ed51585387fe" containerName="collect-profiles" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.939774 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="148f26a3-3a38-4c59-b627-ed51585387fe" containerName="collect-profiles" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.941275 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.951077 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xw56p"] Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.987752 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9xf\" (UniqueName: \"kubernetes.io/projected/dccaf593-ad76-49b3-8720-4cc69f07c610-kube-api-access-rz9xf\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.988075 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-utilities\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:26 crc kubenswrapper[4921]: I0318 12:45:26.988618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-catalog-content\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.089436 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9xf\" (UniqueName: \"kubernetes.io/projected/dccaf593-ad76-49b3-8720-4cc69f07c610-kube-api-access-rz9xf\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.089502 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-utilities\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.089540 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-catalog-content\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.090024 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-utilities\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.090089 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-catalog-content\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.110171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9xf\" (UniqueName: \"kubernetes.io/projected/dccaf593-ad76-49b3-8720-4cc69f07c610-kube-api-access-rz9xf\") pod \"certified-operators-xw56p\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.259668 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:27 crc kubenswrapper[4921]: I0318 12:45:27.710950 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xw56p"] Mar 18 12:45:28 crc kubenswrapper[4921]: I0318 12:45:28.636444 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xw56p" event={"ID":"dccaf593-ad76-49b3-8720-4cc69f07c610","Type":"ContainerDied","Data":"b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5"} Mar 18 12:45:28 crc kubenswrapper[4921]: I0318 12:45:28.636286 4921 generic.go:334] "Generic (PLEG): container finished" podID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerID="b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5" exitCode=0 Mar 18 12:45:28 crc kubenswrapper[4921]: I0318 12:45:28.637325 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xw56p" event={"ID":"dccaf593-ad76-49b3-8720-4cc69f07c610","Type":"ContainerStarted","Data":"7a5035b28ca98ae0eeeaf85c02568516ff1eabad568942ca542ffee736f509c9"} Mar 18 12:45:28 crc kubenswrapper[4921]: I0318 12:45:28.638941 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:45:30 crc kubenswrapper[4921]: I0318 12:45:30.672431 4921 generic.go:334] "Generic (PLEG): container finished" podID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerID="6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51" exitCode=0 Mar 18 12:45:30 crc kubenswrapper[4921]: I0318 12:45:30.672924 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xw56p" event={"ID":"dccaf593-ad76-49b3-8720-4cc69f07c610","Type":"ContainerDied","Data":"6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51"} Mar 18 12:45:31 crc kubenswrapper[4921]: I0318 12:45:31.681938 4921 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-xw56p" event={"ID":"dccaf593-ad76-49b3-8720-4cc69f07c610","Type":"ContainerStarted","Data":"24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748"} Mar 18 12:45:31 crc kubenswrapper[4921]: I0318 12:45:31.707560 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xw56p" podStartSLOduration=3.2476471670000002 podStartE2EDuration="5.707532789s" podCreationTimestamp="2026-03-18 12:45:26 +0000 UTC" firstStartedPulling="2026-03-18 12:45:28.638634635 +0000 UTC m=+2148.188555274" lastFinishedPulling="2026-03-18 12:45:31.098520257 +0000 UTC m=+2150.648440896" observedRunningTime="2026-03-18 12:45:31.699074649 +0000 UTC m=+2151.248995298" watchObservedRunningTime="2026-03-18 12:45:31.707532789 +0000 UTC m=+2151.257453438" Mar 18 12:45:37 crc kubenswrapper[4921]: I0318 12:45:37.260579 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:37 crc kubenswrapper[4921]: I0318 12:45:37.261190 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:37 crc kubenswrapper[4921]: I0318 12:45:37.317761 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:37 crc kubenswrapper[4921]: I0318 12:45:37.762425 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:37 crc kubenswrapper[4921]: I0318 12:45:37.805016 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xw56p"] Mar 18 12:45:39 crc kubenswrapper[4921]: I0318 12:45:39.736856 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xw56p" 
podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="registry-server" containerID="cri-o://24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748" gracePeriod=2 Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.221237 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.288309 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-utilities\") pod \"dccaf593-ad76-49b3-8720-4cc69f07c610\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.288622 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz9xf\" (UniqueName: \"kubernetes.io/projected/dccaf593-ad76-49b3-8720-4cc69f07c610-kube-api-access-rz9xf\") pod \"dccaf593-ad76-49b3-8720-4cc69f07c610\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.288719 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-catalog-content\") pod \"dccaf593-ad76-49b3-8720-4cc69f07c610\" (UID: \"dccaf593-ad76-49b3-8720-4cc69f07c610\") " Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.289328 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-utilities" (OuterVolumeSpecName: "utilities") pod "dccaf593-ad76-49b3-8720-4cc69f07c610" (UID: "dccaf593-ad76-49b3-8720-4cc69f07c610"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.295680 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dccaf593-ad76-49b3-8720-4cc69f07c610-kube-api-access-rz9xf" (OuterVolumeSpecName: "kube-api-access-rz9xf") pod "dccaf593-ad76-49b3-8720-4cc69f07c610" (UID: "dccaf593-ad76-49b3-8720-4cc69f07c610"). InnerVolumeSpecName "kube-api-access-rz9xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.390314 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.390348 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz9xf\" (UniqueName: \"kubernetes.io/projected/dccaf593-ad76-49b3-8720-4cc69f07c610-kube-api-access-rz9xf\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.401422 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dccaf593-ad76-49b3-8720-4cc69f07c610" (UID: "dccaf593-ad76-49b3-8720-4cc69f07c610"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.491480 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dccaf593-ad76-49b3-8720-4cc69f07c610-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.746203 4921 generic.go:334] "Generic (PLEG): container finished" podID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerID="24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748" exitCode=0 Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.746250 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xw56p" event={"ID":"dccaf593-ad76-49b3-8720-4cc69f07c610","Type":"ContainerDied","Data":"24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748"} Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.746283 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xw56p" event={"ID":"dccaf593-ad76-49b3-8720-4cc69f07c610","Type":"ContainerDied","Data":"7a5035b28ca98ae0eeeaf85c02568516ff1eabad568942ca542ffee736f509c9"} Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.746301 4921 scope.go:117] "RemoveContainer" containerID="24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.746467 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xw56p" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.764881 4921 scope.go:117] "RemoveContainer" containerID="6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.778301 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xw56p"] Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.787944 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xw56p"] Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.798548 4921 scope.go:117] "RemoveContainer" containerID="b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.825932 4921 scope.go:117] "RemoveContainer" containerID="24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748" Mar 18 12:45:40 crc kubenswrapper[4921]: E0318 12:45:40.826505 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748\": container with ID starting with 24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748 not found: ID does not exist" containerID="24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.826534 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748"} err="failed to get container status \"24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748\": rpc error: code = NotFound desc = could not find container \"24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748\": container with ID starting with 24a9085b931ccfe0be13404e431410efe304d9be97da8c9fb11be30ee09ab748 not 
found: ID does not exist" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.826555 4921 scope.go:117] "RemoveContainer" containerID="6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51" Mar 18 12:45:40 crc kubenswrapper[4921]: E0318 12:45:40.827012 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51\": container with ID starting with 6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51 not found: ID does not exist" containerID="6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.827054 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51"} err="failed to get container status \"6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51\": rpc error: code = NotFound desc = could not find container \"6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51\": container with ID starting with 6bcb4373688df3689a089c6d59899188be3f413f5197ed8da2d071396fcf4b51 not found: ID does not exist" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.827084 4921 scope.go:117] "RemoveContainer" containerID="b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5" Mar 18 12:45:40 crc kubenswrapper[4921]: E0318 12:45:40.827632 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5\": container with ID starting with b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5 not found: ID does not exist" containerID="b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5" Mar 18 12:45:40 crc kubenswrapper[4921]: I0318 12:45:40.827673 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5"} err="failed to get container status \"b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5\": rpc error: code = NotFound desc = could not find container \"b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5\": container with ID starting with b04e0e25245f493de10fcb08a060dbb22078bda2c5a80934ca63e0797c2b0cc5 not found: ID does not exist" Mar 18 12:45:41 crc kubenswrapper[4921]: I0318 12:45:41.219134 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" path="/var/lib/kubelet/pods/dccaf593-ad76-49b3-8720-4cc69f07c610/volumes" Mar 18 12:45:47 crc kubenswrapper[4921]: I0318 12:45:47.081125 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:45:47 crc kubenswrapper[4921]: I0318 12:45:47.082632 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.145090 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563966-mnsdl"] Mar 18 12:46:00 crc kubenswrapper[4921]: E0318 12:46:00.145944 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="extract-content" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.145962 4921 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="extract-content" Mar 18 12:46:00 crc kubenswrapper[4921]: E0318 12:46:00.145977 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="extract-utilities" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.145986 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="extract-utilities" Mar 18 12:46:00 crc kubenswrapper[4921]: E0318 12:46:00.146009 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="registry-server" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.146018 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="registry-server" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.146223 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dccaf593-ad76-49b3-8720-4cc69f07c610" containerName="registry-server" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.146838 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.152517 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.152858 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.153044 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.156334 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-mnsdl"] Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.274997 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9644\" (UniqueName: \"kubernetes.io/projected/08b91825-6eb2-4029-969f-97b8d045cfd3-kube-api-access-x9644\") pod \"auto-csr-approver-29563966-mnsdl\" (UID: \"08b91825-6eb2-4029-969f-97b8d045cfd3\") " pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.376153 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9644\" (UniqueName: \"kubernetes.io/projected/08b91825-6eb2-4029-969f-97b8d045cfd3-kube-api-access-x9644\") pod \"auto-csr-approver-29563966-mnsdl\" (UID: \"08b91825-6eb2-4029-969f-97b8d045cfd3\") " pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.398598 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9644\" (UniqueName: \"kubernetes.io/projected/08b91825-6eb2-4029-969f-97b8d045cfd3-kube-api-access-x9644\") pod \"auto-csr-approver-29563966-mnsdl\" (UID: \"08b91825-6eb2-4029-969f-97b8d045cfd3\") " 
pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.464029 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:00 crc kubenswrapper[4921]: I0318 12:46:00.895175 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-mnsdl"] Mar 18 12:46:01 crc kubenswrapper[4921]: I0318 12:46:01.909949 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" event={"ID":"08b91825-6eb2-4029-969f-97b8d045cfd3","Type":"ContainerStarted","Data":"45914f19d00b4ebbb505261701374fe99d3affb4403709c79368b0700abc2411"} Mar 18 12:46:02 crc kubenswrapper[4921]: I0318 12:46:02.920607 4921 generic.go:334] "Generic (PLEG): container finished" podID="08b91825-6eb2-4029-969f-97b8d045cfd3" containerID="7ef892b3ed6dd56721ee26374104acb5290bcdb706a30e36de71a1ce1430b113" exitCode=0 Mar 18 12:46:02 crc kubenswrapper[4921]: I0318 12:46:02.920664 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" event={"ID":"08b91825-6eb2-4029-969f-97b8d045cfd3","Type":"ContainerDied","Data":"7ef892b3ed6dd56721ee26374104acb5290bcdb706a30e36de71a1ce1430b113"} Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.190020 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.331194 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9644\" (UniqueName: \"kubernetes.io/projected/08b91825-6eb2-4029-969f-97b8d045cfd3-kube-api-access-x9644\") pod \"08b91825-6eb2-4029-969f-97b8d045cfd3\" (UID: \"08b91825-6eb2-4029-969f-97b8d045cfd3\") " Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.340407 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b91825-6eb2-4029-969f-97b8d045cfd3-kube-api-access-x9644" (OuterVolumeSpecName: "kube-api-access-x9644") pod "08b91825-6eb2-4029-969f-97b8d045cfd3" (UID: "08b91825-6eb2-4029-969f-97b8d045cfd3"). InnerVolumeSpecName "kube-api-access-x9644". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.433749 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9644\" (UniqueName: \"kubernetes.io/projected/08b91825-6eb2-4029-969f-97b8d045cfd3-kube-api-access-x9644\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.877059 4921 scope.go:117] "RemoveContainer" containerID="65d7dc68c785cfb8dbd858a92ba73968ae8a9c262d45508300c51b2dfc95d3e9" Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.939203 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.939181 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-mnsdl" event={"ID":"08b91825-6eb2-4029-969f-97b8d045cfd3","Type":"ContainerDied","Data":"45914f19d00b4ebbb505261701374fe99d3affb4403709c79368b0700abc2411"} Mar 18 12:46:04 crc kubenswrapper[4921]: I0318 12:46:04.939322 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45914f19d00b4ebbb505261701374fe99d3affb4403709c79368b0700abc2411" Mar 18 12:46:05 crc kubenswrapper[4921]: I0318 12:46:05.271565 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-snlp4"] Mar 18 12:46:05 crc kubenswrapper[4921]: I0318 12:46:05.283665 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-snlp4"] Mar 18 12:46:07 crc kubenswrapper[4921]: I0318 12:46:07.223843 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484733a9-30b5-4c90-b297-5cf18424e87f" path="/var/lib/kubelet/pods/484733a9-30b5-4c90-b297-5cf18424e87f/volumes" Mar 18 12:46:17 crc kubenswrapper[4921]: I0318 12:46:17.080998 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:46:17 crc kubenswrapper[4921]: I0318 12:46:17.081571 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:46:47 crc 
kubenswrapper[4921]: I0318 12:46:47.081097 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.081728 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.081774 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.082506 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.082577 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" gracePeriod=600 Mar 18 12:46:47 crc kubenswrapper[4921]: E0318 12:46:47.214157 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.242677 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" exitCode=0 Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.242745 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5"} Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.243037 4921 scope.go:117] "RemoveContainer" containerID="ab9dd9e85306f746850577c44545c0740315aa322f67909555228d828bc165e9" Mar 18 12:46:47 crc kubenswrapper[4921]: I0318 12:46:47.247906 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:46:47 crc kubenswrapper[4921]: E0318 12:46:47.248401 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:47:00 crc kubenswrapper[4921]: I0318 12:47:00.209404 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:47:00 crc kubenswrapper[4921]: E0318 12:47:00.210038 4921 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:47:04 crc kubenswrapper[4921]: I0318 12:47:04.970260 4921 scope.go:117] "RemoveContainer" containerID="a11975923dd70f014b0bf24cb7ef0e4958d37fce2428a2372d7281c16e0be8b3" Mar 18 12:47:11 crc kubenswrapper[4921]: I0318 12:47:11.214206 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:47:11 crc kubenswrapper[4921]: E0318 12:47:11.214985 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:47:26 crc kubenswrapper[4921]: I0318 12:47:26.209323 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:47:26 crc kubenswrapper[4921]: E0318 12:47:26.210091 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:47:40 crc kubenswrapper[4921]: I0318 12:47:40.208816 4921 scope.go:117] "RemoveContainer" 
containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:47:40 crc kubenswrapper[4921]: E0318 12:47:40.209566 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:47:55 crc kubenswrapper[4921]: I0318 12:47:55.209593 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:47:55 crc kubenswrapper[4921]: E0318 12:47:55.210359 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.139415 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563968-gnhvk"] Mar 18 12:48:00 crc kubenswrapper[4921]: E0318 12:48:00.140132 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b91825-6eb2-4029-969f-97b8d045cfd3" containerName="oc" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.140151 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b91825-6eb2-4029-969f-97b8d045cfd3" containerName="oc" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.140288 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b91825-6eb2-4029-969f-97b8d045cfd3" containerName="oc" Mar 18 12:48:00 crc 
kubenswrapper[4921]: I0318 12:48:00.140834 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.143057 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.143422 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.144190 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.147022 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-gnhvk"] Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.251688 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv988\" (UniqueName: \"kubernetes.io/projected/287dc07f-b63c-4c74-b1d2-2f09bd172e3d-kube-api-access-dv988\") pod \"auto-csr-approver-29563968-gnhvk\" (UID: \"287dc07f-b63c-4c74-b1d2-2f09bd172e3d\") " pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.353831 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv988\" (UniqueName: \"kubernetes.io/projected/287dc07f-b63c-4c74-b1d2-2f09bd172e3d-kube-api-access-dv988\") pod \"auto-csr-approver-29563968-gnhvk\" (UID: \"287dc07f-b63c-4c74-b1d2-2f09bd172e3d\") " pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.372054 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv988\" (UniqueName: \"kubernetes.io/projected/287dc07f-b63c-4c74-b1d2-2f09bd172e3d-kube-api-access-dv988\") pod 
\"auto-csr-approver-29563968-gnhvk\" (UID: \"287dc07f-b63c-4c74-b1d2-2f09bd172e3d\") " pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.463516 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:00 crc kubenswrapper[4921]: I0318 12:48:00.940087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-gnhvk"] Mar 18 12:48:01 crc kubenswrapper[4921]: I0318 12:48:01.748151 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" event={"ID":"287dc07f-b63c-4c74-b1d2-2f09bd172e3d","Type":"ContainerStarted","Data":"9c2781d63d18ab9b9de935c19484d7999015307ae56071dc6f38abd86ea9181f"} Mar 18 12:48:02 crc kubenswrapper[4921]: I0318 12:48:02.756501 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" event={"ID":"287dc07f-b63c-4c74-b1d2-2f09bd172e3d","Type":"ContainerStarted","Data":"5908a7386a59ef31db7312bab3e273fda7d4889e816bc8306270bf5a1c6bc9d5"} Mar 18 12:48:02 crc kubenswrapper[4921]: I0318 12:48:02.772623 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" podStartSLOduration=1.6664334379999999 podStartE2EDuration="2.772604163s" podCreationTimestamp="2026-03-18 12:48:00 +0000 UTC" firstStartedPulling="2026-03-18 12:48:00.944417226 +0000 UTC m=+2300.494337885" lastFinishedPulling="2026-03-18 12:48:02.050587971 +0000 UTC m=+2301.600508610" observedRunningTime="2026-03-18 12:48:02.767992132 +0000 UTC m=+2302.317912771" watchObservedRunningTime="2026-03-18 12:48:02.772604163 +0000 UTC m=+2302.322524802" Mar 18 12:48:03 crc kubenswrapper[4921]: I0318 12:48:03.765461 4921 generic.go:334] "Generic (PLEG): container finished" podID="287dc07f-b63c-4c74-b1d2-2f09bd172e3d" 
containerID="5908a7386a59ef31db7312bab3e273fda7d4889e816bc8306270bf5a1c6bc9d5" exitCode=0 Mar 18 12:48:03 crc kubenswrapper[4921]: I0318 12:48:03.765535 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" event={"ID":"287dc07f-b63c-4c74-b1d2-2f09bd172e3d","Type":"ContainerDied","Data":"5908a7386a59ef31db7312bab3e273fda7d4889e816bc8306270bf5a1c6bc9d5"} Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.037887 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.124894 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv988\" (UniqueName: \"kubernetes.io/projected/287dc07f-b63c-4c74-b1d2-2f09bd172e3d-kube-api-access-dv988\") pod \"287dc07f-b63c-4c74-b1d2-2f09bd172e3d\" (UID: \"287dc07f-b63c-4c74-b1d2-2f09bd172e3d\") " Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.130277 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287dc07f-b63c-4c74-b1d2-2f09bd172e3d-kube-api-access-dv988" (OuterVolumeSpecName: "kube-api-access-dv988") pod "287dc07f-b63c-4c74-b1d2-2f09bd172e3d" (UID: "287dc07f-b63c-4c74-b1d2-2f09bd172e3d"). InnerVolumeSpecName "kube-api-access-dv988". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.226934 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv988\" (UniqueName: \"kubernetes.io/projected/287dc07f-b63c-4c74-b1d2-2f09bd172e3d-kube-api-access-dv988\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.785435 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" event={"ID":"287dc07f-b63c-4c74-b1d2-2f09bd172e3d","Type":"ContainerDied","Data":"9c2781d63d18ab9b9de935c19484d7999015307ae56071dc6f38abd86ea9181f"} Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.785483 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c2781d63d18ab9b9de935c19484d7999015307ae56071dc6f38abd86ea9181f" Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.785505 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-gnhvk" Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.833762 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-rvxsh"] Mar 18 12:48:05 crc kubenswrapper[4921]: I0318 12:48:05.839358 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-rvxsh"] Mar 18 12:48:06 crc kubenswrapper[4921]: I0318 12:48:06.209314 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:48:06 crc kubenswrapper[4921]: E0318 12:48:06.209654 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:48:07 crc kubenswrapper[4921]: I0318 12:48:07.216725 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef7c256-14a8-49b3-b3a0-d6817580141c" path="/var/lib/kubelet/pods/1ef7c256-14a8-49b3-b3a0-d6817580141c/volumes" Mar 18 12:48:19 crc kubenswrapper[4921]: I0318 12:48:19.208645 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:48:19 crc kubenswrapper[4921]: E0318 12:48:19.209408 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:48:30 crc kubenswrapper[4921]: I0318 12:48:30.209493 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:48:30 crc kubenswrapper[4921]: E0318 12:48:30.210252 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:48:45 crc kubenswrapper[4921]: I0318 12:48:45.209619 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:48:45 crc kubenswrapper[4921]: E0318 12:48:45.210469 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:49:00 crc kubenswrapper[4921]: I0318 12:49:00.209209 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:49:00 crc kubenswrapper[4921]: E0318 12:49:00.209861 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:49:05 crc kubenswrapper[4921]: I0318 12:49:05.045253 4921 scope.go:117] "RemoveContainer" containerID="8cbd3f99f290ed62aef30fa0945393e2eaac2000b989ab478144a662606e991b" Mar 18 12:49:12 crc kubenswrapper[4921]: I0318 12:49:12.209149 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:49:12 crc kubenswrapper[4921]: E0318 12:49:12.209735 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.493274 4921 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djlpn"] Mar 18 12:49:19 crc kubenswrapper[4921]: E0318 12:49:19.494210 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287dc07f-b63c-4c74-b1d2-2f09bd172e3d" containerName="oc" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.494227 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="287dc07f-b63c-4c74-b1d2-2f09bd172e3d" containerName="oc" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.494415 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="287dc07f-b63c-4c74-b1d2-2f09bd172e3d" containerName="oc" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.495435 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.504044 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djlpn"] Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.512641 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-catalog-content\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.512695 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trjhl\" (UniqueName: \"kubernetes.io/projected/d9d5fb19-6d72-4f69-bab3-46554b8a4623-kube-api-access-trjhl\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.512806 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-utilities\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.614376 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-utilities\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.614498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-catalog-content\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.614535 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trjhl\" (UniqueName: \"kubernetes.io/projected/d9d5fb19-6d72-4f69-bab3-46554b8a4623-kube-api-access-trjhl\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.614883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-utilities\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.615296 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-catalog-content\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.635779 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trjhl\" (UniqueName: \"kubernetes.io/projected/d9d5fb19-6d72-4f69-bab3-46554b8a4623-kube-api-access-trjhl\") pod \"community-operators-djlpn\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:19 crc kubenswrapper[4921]: I0318 12:49:19.843437 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:20 crc kubenswrapper[4921]: I0318 12:49:20.318546 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djlpn"] Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.327995 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerID="908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63" exitCode=0 Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.328173 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerDied","Data":"908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63"} Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.328344 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerStarted","Data":"f3332a08f863910fa908bcd56e9e3e5fdbc19fefa0c77f7a5be756a8b5c51d4b"} Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.750154 4921 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-7gjcd"] Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.753724 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.768096 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gjcd"] Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.846168 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-catalog-content\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.846237 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sdc5\" (UniqueName: \"kubernetes.io/projected/1f9f06b4-05da-4727-a976-d297a8110c16-kube-api-access-5sdc5\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.846324 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-utilities\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.948268 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-catalog-content\") pod \"redhat-marketplace-7gjcd\" (UID: 
\"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.948329 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sdc5\" (UniqueName: \"kubernetes.io/projected/1f9f06b4-05da-4727-a976-d297a8110c16-kube-api-access-5sdc5\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.948355 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-utilities\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.948995 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-catalog-content\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.949053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-utilities\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:21 crc kubenswrapper[4921]: I0318 12:49:21.975504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sdc5\" (UniqueName: \"kubernetes.io/projected/1f9f06b4-05da-4727-a976-d297a8110c16-kube-api-access-5sdc5\") pod \"redhat-marketplace-7gjcd\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " 
pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:22 crc kubenswrapper[4921]: I0318 12:49:22.089591 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:22 crc kubenswrapper[4921]: I0318 12:49:22.343572 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerStarted","Data":"a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8"} Mar 18 12:49:22 crc kubenswrapper[4921]: I0318 12:49:22.366674 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gjcd"] Mar 18 12:49:22 crc kubenswrapper[4921]: W0318 12:49:22.376741 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9f06b4_05da_4727_a976_d297a8110c16.slice/crio-46e2326cc73779a3792519c1b6575ded7e948aa1909caacea6681ef826507ca7 WatchSource:0}: Error finding container 46e2326cc73779a3792519c1b6575ded7e948aa1909caacea6681ef826507ca7: Status 404 returned error can't find the container with id 46e2326cc73779a3792519c1b6575ded7e948aa1909caacea6681ef826507ca7 Mar 18 12:49:23 crc kubenswrapper[4921]: I0318 12:49:23.351925 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerID="a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8" exitCode=0 Mar 18 12:49:23 crc kubenswrapper[4921]: I0318 12:49:23.352057 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerDied","Data":"a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8"} Mar 18 12:49:23 crc kubenswrapper[4921]: I0318 12:49:23.353573 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="1f9f06b4-05da-4727-a976-d297a8110c16" containerID="3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3" exitCode=0 Mar 18 12:49:23 crc kubenswrapper[4921]: I0318 12:49:23.353600 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gjcd" event={"ID":"1f9f06b4-05da-4727-a976-d297a8110c16","Type":"ContainerDied","Data":"3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3"} Mar 18 12:49:23 crc kubenswrapper[4921]: I0318 12:49:23.353619 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gjcd" event={"ID":"1f9f06b4-05da-4727-a976-d297a8110c16","Type":"ContainerStarted","Data":"46e2326cc73779a3792519c1b6575ded7e948aa1909caacea6681ef826507ca7"} Mar 18 12:49:24 crc kubenswrapper[4921]: I0318 12:49:24.364651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerStarted","Data":"872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa"} Mar 18 12:49:24 crc kubenswrapper[4921]: I0318 12:49:24.369673 4921 generic.go:334] "Generic (PLEG): container finished" podID="1f9f06b4-05da-4727-a976-d297a8110c16" containerID="772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955" exitCode=0 Mar 18 12:49:24 crc kubenswrapper[4921]: I0318 12:49:24.369773 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gjcd" event={"ID":"1f9f06b4-05da-4727-a976-d297a8110c16","Type":"ContainerDied","Data":"772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955"} Mar 18 12:49:24 crc kubenswrapper[4921]: I0318 12:49:24.388715 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djlpn" podStartSLOduration=2.975126425 podStartE2EDuration="5.388688073s" podCreationTimestamp="2026-03-18 12:49:19 +0000 UTC" 
firstStartedPulling="2026-03-18 12:49:21.331340796 +0000 UTC m=+2380.881261435" lastFinishedPulling="2026-03-18 12:49:23.744902454 +0000 UTC m=+2383.294823083" observedRunningTime="2026-03-18 12:49:24.38364807 +0000 UTC m=+2383.933568729" watchObservedRunningTime="2026-03-18 12:49:24.388688073 +0000 UTC m=+2383.938608712" Mar 18 12:49:25 crc kubenswrapper[4921]: I0318 12:49:25.379188 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gjcd" event={"ID":"1f9f06b4-05da-4727-a976-d297a8110c16","Type":"ContainerStarted","Data":"645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4"} Mar 18 12:49:25 crc kubenswrapper[4921]: I0318 12:49:25.402874 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7gjcd" podStartSLOduration=2.866015004 podStartE2EDuration="4.402844955s" podCreationTimestamp="2026-03-18 12:49:21 +0000 UTC" firstStartedPulling="2026-03-18 12:49:23.356215141 +0000 UTC m=+2382.906135780" lastFinishedPulling="2026-03-18 12:49:24.893045092 +0000 UTC m=+2384.442965731" observedRunningTime="2026-03-18 12:49:25.400621262 +0000 UTC m=+2384.950541901" watchObservedRunningTime="2026-03-18 12:49:25.402844955 +0000 UTC m=+2384.952765594" Mar 18 12:49:26 crc kubenswrapper[4921]: I0318 12:49:26.209844 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:49:26 crc kubenswrapper[4921]: E0318 12:49:26.210682 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:49:29 crc kubenswrapper[4921]: I0318 
12:49:29.843907 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:29 crc kubenswrapper[4921]: I0318 12:49:29.844441 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:29 crc kubenswrapper[4921]: I0318 12:49:29.915599 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:30 crc kubenswrapper[4921]: I0318 12:49:30.451787 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:30 crc kubenswrapper[4921]: I0318 12:49:30.508057 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djlpn"] Mar 18 12:49:32 crc kubenswrapper[4921]: I0318 12:49:32.090184 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:32 crc kubenswrapper[4921]: I0318 12:49:32.090614 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:32 crc kubenswrapper[4921]: I0318 12:49:32.139981 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:32 crc kubenswrapper[4921]: I0318 12:49:32.421979 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djlpn" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="registry-server" containerID="cri-o://872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa" gracePeriod=2 Mar 18 12:49:32 crc kubenswrapper[4921]: I0318 12:49:32.472513 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.340502 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.432658 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerID="872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa" exitCode=0 Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.432714 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djlpn" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.432741 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerDied","Data":"872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa"} Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.432795 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djlpn" event={"ID":"d9d5fb19-6d72-4f69-bab3-46554b8a4623","Type":"ContainerDied","Data":"f3332a08f863910fa908bcd56e9e3e5fdbc19fefa0c77f7a5be756a8b5c51d4b"} Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.432813 4921 scope.go:117] "RemoveContainer" containerID="872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.451776 4921 scope.go:117] "RemoveContainer" containerID="a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.475085 4921 scope.go:117] "RemoveContainer" containerID="908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.497964 4921 scope.go:117] "RemoveContainer" 
containerID="872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa" Mar 18 12:49:33 crc kubenswrapper[4921]: E0318 12:49:33.498539 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa\": container with ID starting with 872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa not found: ID does not exist" containerID="872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.498596 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa"} err="failed to get container status \"872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa\": rpc error: code = NotFound desc = could not find container \"872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa\": container with ID starting with 872c0e650b96ea098944559fb7d9cccc24bc974d0c8fc225c373427b3a9fc7fa not found: ID does not exist" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.498639 4921 scope.go:117] "RemoveContainer" containerID="a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8" Mar 18 12:49:33 crc kubenswrapper[4921]: E0318 12:49:33.499586 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8\": container with ID starting with a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8 not found: ID does not exist" containerID="a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.499709 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8"} err="failed to get container status \"a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8\": rpc error: code = NotFound desc = could not find container \"a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8\": container with ID starting with a50041e070127ef45de42d1e1b12221e893791e78ab7fa16629d4036994fc8b8 not found: ID does not exist" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.499784 4921 scope.go:117] "RemoveContainer" containerID="908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63" Mar 18 12:49:33 crc kubenswrapper[4921]: E0318 12:49:33.500317 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63\": container with ID starting with 908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63 not found: ID does not exist" containerID="908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.500352 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63"} err="failed to get container status \"908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63\": rpc error: code = NotFound desc = could not find container \"908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63\": container with ID starting with 908f9292af5f84d51cff14110c98f9e230b0abc67134d5573ae12d2e8ea79f63 not found: ID does not exist" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.516955 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trjhl\" (UniqueName: \"kubernetes.io/projected/d9d5fb19-6d72-4f69-bab3-46554b8a4623-kube-api-access-trjhl\") pod 
\"d9d5fb19-6d72-4f69-bab3-46554b8a4623\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.517034 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-utilities\") pod \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.517243 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-catalog-content\") pod \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\" (UID: \"d9d5fb19-6d72-4f69-bab3-46554b8a4623\") " Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.518550 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-utilities" (OuterVolumeSpecName: "utilities") pod "d9d5fb19-6d72-4f69-bab3-46554b8a4623" (UID: "d9d5fb19-6d72-4f69-bab3-46554b8a4623"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.523780 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d5fb19-6d72-4f69-bab3-46554b8a4623-kube-api-access-trjhl" (OuterVolumeSpecName: "kube-api-access-trjhl") pod "d9d5fb19-6d72-4f69-bab3-46554b8a4623" (UID: "d9d5fb19-6d72-4f69-bab3-46554b8a4623"). InnerVolumeSpecName "kube-api-access-trjhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.573202 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9d5fb19-6d72-4f69-bab3-46554b8a4623" (UID: "d9d5fb19-6d72-4f69-bab3-46554b8a4623"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.619939 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trjhl\" (UniqueName: \"kubernetes.io/projected/d9d5fb19-6d72-4f69-bab3-46554b8a4623-kube-api-access-trjhl\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.619975 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.619992 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d5fb19-6d72-4f69-bab3-46554b8a4623-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.767740 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djlpn"] Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.773381 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djlpn"] Mar 18 12:49:33 crc kubenswrapper[4921]: I0318 12:49:33.887511 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gjcd"] Mar 18 12:49:34 crc kubenswrapper[4921]: I0318 12:49:34.441646 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-7gjcd" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="registry-server" containerID="cri-o://645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4" gracePeriod=2 Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.218180 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" path="/var/lib/kubelet/pods/d9d5fb19-6d72-4f69-bab3-46554b8a4623/volumes" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.347609 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.448526 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-catalog-content\") pod \"1f9f06b4-05da-4727-a976-d297a8110c16\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.448637 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-utilities\") pod \"1f9f06b4-05da-4727-a976-d297a8110c16\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.448676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sdc5\" (UniqueName: \"kubernetes.io/projected/1f9f06b4-05da-4727-a976-d297a8110c16-kube-api-access-5sdc5\") pod \"1f9f06b4-05da-4727-a976-d297a8110c16\" (UID: \"1f9f06b4-05da-4727-a976-d297a8110c16\") " Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.449861 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-utilities" (OuterVolumeSpecName: "utilities") pod 
"1f9f06b4-05da-4727-a976-d297a8110c16" (UID: "1f9f06b4-05da-4727-a976-d297a8110c16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.451655 4921 generic.go:334] "Generic (PLEG): container finished" podID="1f9f06b4-05da-4727-a976-d297a8110c16" containerID="645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4" exitCode=0 Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.451700 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gjcd" event={"ID":"1f9f06b4-05da-4727-a976-d297a8110c16","Type":"ContainerDied","Data":"645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4"} Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.451744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gjcd" event={"ID":"1f9f06b4-05da-4727-a976-d297a8110c16","Type":"ContainerDied","Data":"46e2326cc73779a3792519c1b6575ded7e948aa1909caacea6681ef826507ca7"} Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.451766 4921 scope.go:117] "RemoveContainer" containerID="645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.451814 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gjcd" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.455426 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9f06b4-05da-4727-a976-d297a8110c16-kube-api-access-5sdc5" (OuterVolumeSpecName: "kube-api-access-5sdc5") pod "1f9f06b4-05da-4727-a976-d297a8110c16" (UID: "1f9f06b4-05da-4727-a976-d297a8110c16"). InnerVolumeSpecName "kube-api-access-5sdc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.485368 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f9f06b4-05da-4727-a976-d297a8110c16" (UID: "1f9f06b4-05da-4727-a976-d297a8110c16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.487437 4921 scope.go:117] "RemoveContainer" containerID="772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.506292 4921 scope.go:117] "RemoveContainer" containerID="3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.526399 4921 scope.go:117] "RemoveContainer" containerID="645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4" Mar 18 12:49:35 crc kubenswrapper[4921]: E0318 12:49:35.526749 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4\": container with ID starting with 645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4 not found: ID does not exist" containerID="645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.526880 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4"} err="failed to get container status \"645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4\": rpc error: code = NotFound desc = could not find container \"645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4\": container with ID starting 
with 645ec4a971bf1061fd494c3e91129d8c02b88c7051d9b50563a6534d16eeb8a4 not found: ID does not exist" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.526909 4921 scope.go:117] "RemoveContainer" containerID="772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955" Mar 18 12:49:35 crc kubenswrapper[4921]: E0318 12:49:35.527201 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955\": container with ID starting with 772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955 not found: ID does not exist" containerID="772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.527224 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955"} err="failed to get container status \"772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955\": rpc error: code = NotFound desc = could not find container \"772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955\": container with ID starting with 772786a182a644e2121243c9e78e19721297da2c9c894235f057687bf81c6955 not found: ID does not exist" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.527237 4921 scope.go:117] "RemoveContainer" containerID="3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3" Mar 18 12:49:35 crc kubenswrapper[4921]: E0318 12:49:35.527446 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3\": container with ID starting with 3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3 not found: ID does not exist" containerID="3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3" Mar 18 12:49:35 
crc kubenswrapper[4921]: I0318 12:49:35.527465 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3"} err="failed to get container status \"3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3\": rpc error: code = NotFound desc = could not find container \"3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3\": container with ID starting with 3b4c29a09309b5e51336560a63231e4ffe70347dbf1a067f825adb476e57a9c3 not found: ID does not exist" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.549721 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.549761 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f9f06b4-05da-4727-a976-d297a8110c16-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.549785 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sdc5\" (UniqueName: \"kubernetes.io/projected/1f9f06b4-05da-4727-a976-d297a8110c16-kube-api-access-5sdc5\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.783492 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gjcd"] Mar 18 12:49:35 crc kubenswrapper[4921]: I0318 12:49:35.789215 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gjcd"] Mar 18 12:49:37 crc kubenswrapper[4921]: I0318 12:49:37.221643 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" path="/var/lib/kubelet/pods/1f9f06b4-05da-4727-a976-d297a8110c16/volumes" Mar 18 12:49:38 
crc kubenswrapper[4921]: I0318 12:49:38.209781 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:49:38 crc kubenswrapper[4921]: E0318 12:49:38.209958 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:49:51 crc kubenswrapper[4921]: I0318 12:49:51.213099 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:49:51 crc kubenswrapper[4921]: E0318 12:49:51.214517 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.160544 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563970-7q6x2"] Mar 18 12:50:00 crc kubenswrapper[4921]: E0318 12:50:00.162275 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="extract-utilities" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.162352 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="extract-utilities" Mar 18 12:50:00 crc kubenswrapper[4921]: E0318 12:50:00.162414 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="extract-content" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.162473 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="extract-content" Mar 18 12:50:00 crc kubenswrapper[4921]: E0318 12:50:00.162539 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="extract-content" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.162595 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="extract-content" Mar 18 12:50:00 crc kubenswrapper[4921]: E0318 12:50:00.162655 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="extract-utilities" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.162714 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="extract-utilities" Mar 18 12:50:00 crc kubenswrapper[4921]: E0318 12:50:00.162769 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.162827 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4921]: E0318 12:50:00.162884 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.162955 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.163163 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1f9f06b4-05da-4727-a976-d297a8110c16" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.163229 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d5fb19-6d72-4f69-bab3-46554b8a4623" containerName="registry-server" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.163682 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.192956 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.193575 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.193836 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.207136 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-7q6x2"] Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.312432 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nkh\" (UniqueName: \"kubernetes.io/projected/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35-kube-api-access-t8nkh\") pod \"auto-csr-approver-29563970-7q6x2\" (UID: \"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35\") " pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.413671 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nkh\" (UniqueName: \"kubernetes.io/projected/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35-kube-api-access-t8nkh\") pod \"auto-csr-approver-29563970-7q6x2\" (UID: 
\"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35\") " pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.441372 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nkh\" (UniqueName: \"kubernetes.io/projected/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35-kube-api-access-t8nkh\") pod \"auto-csr-approver-29563970-7q6x2\" (UID: \"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35\") " pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.516428 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:00 crc kubenswrapper[4921]: I0318 12:50:00.935328 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-7q6x2"] Mar 18 12:50:01 crc kubenswrapper[4921]: I0318 12:50:01.663430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" event={"ID":"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35","Type":"ContainerStarted","Data":"d1e408406e40b319f43b8f60ec07750c8369e837cd43c23a8c742f11bee946aa"} Mar 18 12:50:03 crc kubenswrapper[4921]: I0318 12:50:03.683258 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35" containerID="6f1da6cfd02c0020938c5d7b3db54e16b1f230f6d9a3a5cbb712430104bc7f68" exitCode=0 Mar 18 12:50:03 crc kubenswrapper[4921]: I0318 12:50:03.683563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" event={"ID":"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35","Type":"ContainerDied","Data":"6f1da6cfd02c0020938c5d7b3db54e16b1f230f6d9a3a5cbb712430104bc7f68"} Mar 18 12:50:04 crc kubenswrapper[4921]: I0318 12:50:04.208914 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:50:04 crc kubenswrapper[4921]: 
E0318 12:50:04.209442 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.003682 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.183947 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8nkh\" (UniqueName: \"kubernetes.io/projected/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35-kube-api-access-t8nkh\") pod \"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35\" (UID: \"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35\") " Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.189620 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35-kube-api-access-t8nkh" (OuterVolumeSpecName: "kube-api-access-t8nkh") pod "bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35" (UID: "bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35"). InnerVolumeSpecName "kube-api-access-t8nkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.286400 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8nkh\" (UniqueName: \"kubernetes.io/projected/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35-kube-api-access-t8nkh\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.716753 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" event={"ID":"bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35","Type":"ContainerDied","Data":"d1e408406e40b319f43b8f60ec07750c8369e837cd43c23a8c742f11bee946aa"} Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.716805 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e408406e40b319f43b8f60ec07750c8369e837cd43c23a8c742f11bee946aa" Mar 18 12:50:05 crc kubenswrapper[4921]: I0318 12:50:05.716824 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-7q6x2" Mar 18 12:50:06 crc kubenswrapper[4921]: I0318 12:50:06.070167 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-wndmf"] Mar 18 12:50:06 crc kubenswrapper[4921]: I0318 12:50:06.081847 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-wndmf"] Mar 18 12:50:07 crc kubenswrapper[4921]: I0318 12:50:07.226051 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e194df3-e74a-4fd0-860b-39649533a25f" path="/var/lib/kubelet/pods/9e194df3-e74a-4fd0-860b-39649533a25f/volumes" Mar 18 12:50:19 crc kubenswrapper[4921]: I0318 12:50:19.209318 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:50:19 crc kubenswrapper[4921]: E0318 12:50:19.210146 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:50:33 crc kubenswrapper[4921]: I0318 12:50:33.208913 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:50:33 crc kubenswrapper[4921]: E0318 12:50:33.209519 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:50:45 crc kubenswrapper[4921]: I0318 12:50:45.208710 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:50:45 crc kubenswrapper[4921]: E0318 12:50:45.209367 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:50:56 crc kubenswrapper[4921]: I0318 12:50:56.209638 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:50:56 crc kubenswrapper[4921]: E0318 12:50:56.210337 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:51:05 crc kubenswrapper[4921]: I0318 12:51:05.167079 4921 scope.go:117] "RemoveContainer" containerID="83785af6effad1302d3e450280bf778f94dfc98dbd998c687294191289cd630b" Mar 18 12:51:08 crc kubenswrapper[4921]: I0318 12:51:08.208863 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:51:08 crc kubenswrapper[4921]: E0318 12:51:08.209479 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:51:20 crc kubenswrapper[4921]: I0318 12:51:20.209035 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:51:20 crc kubenswrapper[4921]: E0318 12:51:20.209662 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:51:35 crc kubenswrapper[4921]: I0318 12:51:35.209189 4921 scope.go:117] 
"RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:51:35 crc kubenswrapper[4921]: E0318 12:51:35.210500 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:51:48 crc kubenswrapper[4921]: I0318 12:51:48.209580 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5" Mar 18 12:51:48 crc kubenswrapper[4921]: I0318 12:51:48.539361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3"} Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.152090 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563972-r8szk"] Mar 18 12:52:00 crc kubenswrapper[4921]: E0318 12:52:00.153001 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35" containerName="oc" Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.153019 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35" containerName="oc" Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.153206 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35" containerName="oc" Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.153806 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.157997 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.158671 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.158705 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.161189 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-r8szk"]
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.333884 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769hd\" (UniqueName: \"kubernetes.io/projected/7fd523e5-9690-403e-9006-4463ac51d0b1-kube-api-access-769hd\") pod \"auto-csr-approver-29563972-r8szk\" (UID: \"7fd523e5-9690-403e-9006-4463ac51d0b1\") " pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.435785 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-769hd\" (UniqueName: \"kubernetes.io/projected/7fd523e5-9690-403e-9006-4463ac51d0b1-kube-api-access-769hd\") pod \"auto-csr-approver-29563972-r8szk\" (UID: \"7fd523e5-9690-403e-9006-4463ac51d0b1\") " pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.461643 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-769hd\" (UniqueName: \"kubernetes.io/projected/7fd523e5-9690-403e-9006-4463ac51d0b1-kube-api-access-769hd\") pod \"auto-csr-approver-29563972-r8szk\" (UID: \"7fd523e5-9690-403e-9006-4463ac51d0b1\") " pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.480802 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.947132 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-r8szk"]
Mar 18 12:52:00 crc kubenswrapper[4921]: I0318 12:52:00.954591 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 12:52:01 crc kubenswrapper[4921]: I0318 12:52:01.628638 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-r8szk" event={"ID":"7fd523e5-9690-403e-9006-4463ac51d0b1","Type":"ContainerStarted","Data":"e111843e1ac15d090a641fef9e6d3016765ed5d555ebe4c6af6b5643dab04b5d"}
Mar 18 12:52:03 crc kubenswrapper[4921]: I0318 12:52:03.646468 4921 generic.go:334] "Generic (PLEG): container finished" podID="7fd523e5-9690-403e-9006-4463ac51d0b1" containerID="828643624e7451c874b6fbdd0520771773993349be075039c5cbf03cdc016eeb" exitCode=0
Mar 18 12:52:03 crc kubenswrapper[4921]: I0318 12:52:03.646530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-r8szk" event={"ID":"7fd523e5-9690-403e-9006-4463ac51d0b1","Type":"ContainerDied","Data":"828643624e7451c874b6fbdd0520771773993349be075039c5cbf03cdc016eeb"}
Mar 18 12:52:04 crc kubenswrapper[4921]: I0318 12:52:04.892695 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:04 crc kubenswrapper[4921]: I0318 12:52:04.998251 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769hd\" (UniqueName: \"kubernetes.io/projected/7fd523e5-9690-403e-9006-4463ac51d0b1-kube-api-access-769hd\") pod \"7fd523e5-9690-403e-9006-4463ac51d0b1\" (UID: \"7fd523e5-9690-403e-9006-4463ac51d0b1\") "
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.002861 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd523e5-9690-403e-9006-4463ac51d0b1-kube-api-access-769hd" (OuterVolumeSpecName: "kube-api-access-769hd") pod "7fd523e5-9690-403e-9006-4463ac51d0b1" (UID: "7fd523e5-9690-403e-9006-4463ac51d0b1"). InnerVolumeSpecName "kube-api-access-769hd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.099893 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-769hd\" (UniqueName: \"kubernetes.io/projected/7fd523e5-9690-403e-9006-4463ac51d0b1-kube-api-access-769hd\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.665790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-r8szk" event={"ID":"7fd523e5-9690-403e-9006-4463ac51d0b1","Type":"ContainerDied","Data":"e111843e1ac15d090a641fef9e6d3016765ed5d555ebe4c6af6b5643dab04b5d"}
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.665835 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e111843e1ac15d090a641fef9e6d3016765ed5d555ebe4c6af6b5643dab04b5d"
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.665861 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-r8szk"
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.972266 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-mnsdl"]
Mar 18 12:52:05 crc kubenswrapper[4921]: I0318 12:52:05.977817 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-mnsdl"]
Mar 18 12:52:07 crc kubenswrapper[4921]: I0318 12:52:07.218178 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b91825-6eb2-4029-969f-97b8d045cfd3" path="/var/lib/kubelet/pods/08b91825-6eb2-4029-969f-97b8d045cfd3/volumes"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.650676 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kc8fn"]
Mar 18 12:52:12 crc kubenswrapper[4921]: E0318 12:52:12.651886 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd523e5-9690-403e-9006-4463ac51d0b1" containerName="oc"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.651902 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd523e5-9690-403e-9006-4463ac51d0b1" containerName="oc"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.652067 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd523e5-9690-403e-9006-4463ac51d0b1" containerName="oc"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.653561 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.673356 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kc8fn"]
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.725783 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-catalog-content\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.726165 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-utilities\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.726396 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9nb7\" (UniqueName: \"kubernetes.io/projected/2c9b61aa-235c-42dc-a60f-9132831b02b3-kube-api-access-h9nb7\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.827956 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-catalog-content\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.828262 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-utilities\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.828363 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9nb7\" (UniqueName: \"kubernetes.io/projected/2c9b61aa-235c-42dc-a60f-9132831b02b3-kube-api-access-h9nb7\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.828511 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-catalog-content\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.828717 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-utilities\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.855099 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9nb7\" (UniqueName: \"kubernetes.io/projected/2c9b61aa-235c-42dc-a60f-9132831b02b3-kube-api-access-h9nb7\") pod \"redhat-operators-kc8fn\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") " pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:12 crc kubenswrapper[4921]: I0318 12:52:12.979144 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:13 crc kubenswrapper[4921]: I0318 12:52:13.241714 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kc8fn"]
Mar 18 12:52:13 crc kubenswrapper[4921]: I0318 12:52:13.726486 4921 generic.go:334] "Generic (PLEG): container finished" podID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerID="97929e8a0ce6c0dfba5fe5239904b66f351352679bb49d982e1df4a835c2982c" exitCode=0
Mar 18 12:52:13 crc kubenswrapper[4921]: I0318 12:52:13.726591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerDied","Data":"97929e8a0ce6c0dfba5fe5239904b66f351352679bb49d982e1df4a835c2982c"}
Mar 18 12:52:13 crc kubenswrapper[4921]: I0318 12:52:13.726766 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerStarted","Data":"6020f4d65ca02bc4e4ffa8c0edb3c6bb240bcc89c5e966ff557f237c33beca99"}
Mar 18 12:52:14 crc kubenswrapper[4921]: I0318 12:52:14.737055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerStarted","Data":"d81570f2971af468258bda15f4ff06140d4b34666673149bee05a46a52d4a410"}
Mar 18 12:52:15 crc kubenswrapper[4921]: I0318 12:52:15.745663 4921 generic.go:334] "Generic (PLEG): container finished" podID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerID="d81570f2971af468258bda15f4ff06140d4b34666673149bee05a46a52d4a410" exitCode=0
Mar 18 12:52:15 crc kubenswrapper[4921]: I0318 12:52:15.745726 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerDied","Data":"d81570f2971af468258bda15f4ff06140d4b34666673149bee05a46a52d4a410"}
Mar 18 12:52:16 crc kubenswrapper[4921]: I0318 12:52:16.756611 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerStarted","Data":"7a61cfeb117634ad5fd892a41dd86685270e77d8bf87035f358a9f5b605d664b"}
Mar 18 12:52:16 crc kubenswrapper[4921]: I0318 12:52:16.781084 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kc8fn" podStartSLOduration=2.3318780869999998 podStartE2EDuration="4.781064851s" podCreationTimestamp="2026-03-18 12:52:12 +0000 UTC" firstStartedPulling="2026-03-18 12:52:13.728661832 +0000 UTC m=+2553.278582471" lastFinishedPulling="2026-03-18 12:52:16.177848596 +0000 UTC m=+2555.727769235" observedRunningTime="2026-03-18 12:52:16.773737473 +0000 UTC m=+2556.323658142" watchObservedRunningTime="2026-03-18 12:52:16.781064851 +0000 UTC m=+2556.330985490"
Mar 18 12:52:22 crc kubenswrapper[4921]: I0318 12:52:22.979837 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:22 crc kubenswrapper[4921]: I0318 12:52:22.980448 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:23 crc kubenswrapper[4921]: I0318 12:52:23.023675 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:23 crc kubenswrapper[4921]: I0318 12:52:23.848369 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:23 crc kubenswrapper[4921]: I0318 12:52:23.890060 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kc8fn"]
Mar 18 12:52:25 crc kubenswrapper[4921]: I0318 12:52:25.828520 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kc8fn" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="registry-server" containerID="cri-o://7a61cfeb117634ad5fd892a41dd86685270e77d8bf87035f358a9f5b605d664b" gracePeriod=2
Mar 18 12:52:27 crc kubenswrapper[4921]: I0318 12:52:27.844065 4921 generic.go:334] "Generic (PLEG): container finished" podID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerID="7a61cfeb117634ad5fd892a41dd86685270e77d8bf87035f358a9f5b605d664b" exitCode=0
Mar 18 12:52:27 crc kubenswrapper[4921]: I0318 12:52:27.844143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerDied","Data":"7a61cfeb117634ad5fd892a41dd86685270e77d8bf87035f358a9f5b605d664b"}
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.046984 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.145517 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9nb7\" (UniqueName: \"kubernetes.io/projected/2c9b61aa-235c-42dc-a60f-9132831b02b3-kube-api-access-h9nb7\") pod \"2c9b61aa-235c-42dc-a60f-9132831b02b3\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") "
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.145621 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-catalog-content\") pod \"2c9b61aa-235c-42dc-a60f-9132831b02b3\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") "
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.145718 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-utilities\") pod \"2c9b61aa-235c-42dc-a60f-9132831b02b3\" (UID: \"2c9b61aa-235c-42dc-a60f-9132831b02b3\") "
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.146905 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-utilities" (OuterVolumeSpecName: "utilities") pod "2c9b61aa-235c-42dc-a60f-9132831b02b3" (UID: "2c9b61aa-235c-42dc-a60f-9132831b02b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.150429 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9b61aa-235c-42dc-a60f-9132831b02b3-kube-api-access-h9nb7" (OuterVolumeSpecName: "kube-api-access-h9nb7") pod "2c9b61aa-235c-42dc-a60f-9132831b02b3" (UID: "2c9b61aa-235c-42dc-a60f-9132831b02b3"). InnerVolumeSpecName "kube-api-access-h9nb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.247885 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9nb7\" (UniqueName: \"kubernetes.io/projected/2c9b61aa-235c-42dc-a60f-9132831b02b3-kube-api-access-h9nb7\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.248259 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.285790 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c9b61aa-235c-42dc-a60f-9132831b02b3" (UID: "2c9b61aa-235c-42dc-a60f-9132831b02b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.349701 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c9b61aa-235c-42dc-a60f-9132831b02b3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.857391 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc8fn" event={"ID":"2c9b61aa-235c-42dc-a60f-9132831b02b3","Type":"ContainerDied","Data":"6020f4d65ca02bc4e4ffa8c0edb3c6bb240bcc89c5e966ff557f237c33beca99"}
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.857452 4921 scope.go:117] "RemoveContainer" containerID="7a61cfeb117634ad5fd892a41dd86685270e77d8bf87035f358a9f5b605d664b"
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.857525 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc8fn"
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.880954 4921 scope.go:117] "RemoveContainer" containerID="d81570f2971af468258bda15f4ff06140d4b34666673149bee05a46a52d4a410"
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.898831 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kc8fn"]
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.908070 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kc8fn"]
Mar 18 12:52:28 crc kubenswrapper[4921]: I0318 12:52:28.928240 4921 scope.go:117] "RemoveContainer" containerID="97929e8a0ce6c0dfba5fe5239904b66f351352679bb49d982e1df4a835c2982c"
Mar 18 12:52:29 crc kubenswrapper[4921]: I0318 12:52:29.222418 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" path="/var/lib/kubelet/pods/2c9b61aa-235c-42dc-a60f-9132831b02b3/volumes"
Mar 18 12:53:05 crc kubenswrapper[4921]: I0318 12:53:05.264953 4921 scope.go:117] "RemoveContainer" containerID="7ef892b3ed6dd56721ee26374104acb5290bcdb706a30e36de71a1ce1430b113"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.146550 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563974-jvwct"]
Mar 18 12:54:00 crc kubenswrapper[4921]: E0318 12:54:00.147436 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="extract-utilities"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.147449 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="extract-utilities"
Mar 18 12:54:00 crc kubenswrapper[4921]: E0318 12:54:00.147467 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="registry-server"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.147473 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="registry-server"
Mar 18 12:54:00 crc kubenswrapper[4921]: E0318 12:54:00.147485 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="extract-content"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.147491 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="extract-content"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.147623 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9b61aa-235c-42dc-a60f-9132831b02b3" containerName="registry-server"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.148049 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.154072 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.154166 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.155531 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.160751 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-jvwct"]
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.230892 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ssb\" (UniqueName: \"kubernetes.io/projected/85b19e3d-b80a-4456-a560-ba96c61ea6c2-kube-api-access-d2ssb\") pod \"auto-csr-approver-29563974-jvwct\" (UID: \"85b19e3d-b80a-4456-a560-ba96c61ea6c2\") " pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.333594 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ssb\" (UniqueName: \"kubernetes.io/projected/85b19e3d-b80a-4456-a560-ba96c61ea6c2-kube-api-access-d2ssb\") pod \"auto-csr-approver-29563974-jvwct\" (UID: \"85b19e3d-b80a-4456-a560-ba96c61ea6c2\") " pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.372304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ssb\" (UniqueName: \"kubernetes.io/projected/85b19e3d-b80a-4456-a560-ba96c61ea6c2-kube-api-access-d2ssb\") pod \"auto-csr-approver-29563974-jvwct\" (UID: \"85b19e3d-b80a-4456-a560-ba96c61ea6c2\") " pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.464553 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:00 crc kubenswrapper[4921]: I0318 12:54:00.891809 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-jvwct"]
Mar 18 12:54:01 crc kubenswrapper[4921]: I0318 12:54:01.494690 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-jvwct" event={"ID":"85b19e3d-b80a-4456-a560-ba96c61ea6c2","Type":"ContainerStarted","Data":"c29a88619eb3226105db97cc0d4843e12c565cd417ab7cc2a8a62175757b2cda"}
Mar 18 12:54:02 crc kubenswrapper[4921]: I0318 12:54:02.501362 4921 generic.go:334] "Generic (PLEG): container finished" podID="85b19e3d-b80a-4456-a560-ba96c61ea6c2" containerID="f0be9b609c6b7cc4dadd8c922a104b3e0c6790d23212c69978f6e8913ce75db1" exitCode=0
Mar 18 12:54:02 crc kubenswrapper[4921]: I0318 12:54:02.501430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-jvwct" event={"ID":"85b19e3d-b80a-4456-a560-ba96c61ea6c2","Type":"ContainerDied","Data":"f0be9b609c6b7cc4dadd8c922a104b3e0c6790d23212c69978f6e8913ce75db1"}
Mar 18 12:54:03 crc kubenswrapper[4921]: I0318 12:54:03.765321 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:03 crc kubenswrapper[4921]: I0318 12:54:03.884449 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2ssb\" (UniqueName: \"kubernetes.io/projected/85b19e3d-b80a-4456-a560-ba96c61ea6c2-kube-api-access-d2ssb\") pod \"85b19e3d-b80a-4456-a560-ba96c61ea6c2\" (UID: \"85b19e3d-b80a-4456-a560-ba96c61ea6c2\") "
Mar 18 12:54:03 crc kubenswrapper[4921]: I0318 12:54:03.889797 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b19e3d-b80a-4456-a560-ba96c61ea6c2-kube-api-access-d2ssb" (OuterVolumeSpecName: "kube-api-access-d2ssb") pod "85b19e3d-b80a-4456-a560-ba96c61ea6c2" (UID: "85b19e3d-b80a-4456-a560-ba96c61ea6c2"). InnerVolumeSpecName "kube-api-access-d2ssb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:54:04 crc kubenswrapper[4921]: I0318 12:54:04.013240 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2ssb\" (UniqueName: \"kubernetes.io/projected/85b19e3d-b80a-4456-a560-ba96c61ea6c2-kube-api-access-d2ssb\") on node \"crc\" DevicePath \"\""
Mar 18 12:54:04 crc kubenswrapper[4921]: I0318 12:54:04.516422 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-jvwct" event={"ID":"85b19e3d-b80a-4456-a560-ba96c61ea6c2","Type":"ContainerDied","Data":"c29a88619eb3226105db97cc0d4843e12c565cd417ab7cc2a8a62175757b2cda"}
Mar 18 12:54:04 crc kubenswrapper[4921]: I0318 12:54:04.516463 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-jvwct"
Mar 18 12:54:04 crc kubenswrapper[4921]: I0318 12:54:04.516465 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29a88619eb3226105db97cc0d4843e12c565cd417ab7cc2a8a62175757b2cda"
Mar 18 12:54:04 crc kubenswrapper[4921]: I0318 12:54:04.831360 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-gnhvk"]
Mar 18 12:54:04 crc kubenswrapper[4921]: I0318 12:54:04.836802 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-gnhvk"]
Mar 18 12:54:05 crc kubenswrapper[4921]: I0318 12:54:05.220478 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287dc07f-b63c-4c74-b1d2-2f09bd172e3d" path="/var/lib/kubelet/pods/287dc07f-b63c-4c74-b1d2-2f09bd172e3d/volumes"
Mar 18 12:54:05 crc kubenswrapper[4921]: I0318 12:54:05.351181 4921 scope.go:117] "RemoveContainer" containerID="5908a7386a59ef31db7312bab3e273fda7d4889e816bc8306270bf5a1c6bc9d5"
Mar 18 12:54:17 crc kubenswrapper[4921]: I0318 12:54:17.080934 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:54:17 crc kubenswrapper[4921]: I0318 12:54:17.081655 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:54:47 crc kubenswrapper[4921]: I0318 12:54:47.081323 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:54:47 crc kubenswrapper[4921]: I0318 12:54:47.081834 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:55:17 crc kubenswrapper[4921]: I0318 12:55:17.081770 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:55:17 crc kubenswrapper[4921]: I0318 12:55:17.082508 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:55:17 crc kubenswrapper[4921]: I0318 12:55:17.082557 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7"
Mar 18 12:55:17 crc kubenswrapper[4921]: I0318 12:55:17.083280 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 12:55:17 crc kubenswrapper[4921]: I0318 12:55:17.083335 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3" gracePeriod=600
Mar 18 12:55:17 crc kubenswrapper[4921]: E0318 12:55:17.315711 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509553d8_b894_456c_a45e_665e8497cdbc.slice/crio-conmon-3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 12:55:18 crc kubenswrapper[4921]: I0318 12:55:18.025518 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3" exitCode=0
Mar 18 12:55:18 crc kubenswrapper[4921]: I0318 12:55:18.025576 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3"}
Mar 18 12:55:18 crc kubenswrapper[4921]: I0318 12:55:18.025873 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785"}
Mar 18 12:55:18 crc kubenswrapper[4921]: I0318 12:55:18.025891 4921 scope.go:117] "RemoveContainer" containerID="d640dde2f6637cece09882542b58533baa94d41ad22669f6aeecd257b69342c5"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.147347 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563976-58slt"]
Mar 18 12:56:00 crc kubenswrapper[4921]: E0318 12:56:00.148226 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b19e3d-b80a-4456-a560-ba96c61ea6c2" containerName="oc"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.148240 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b19e3d-b80a-4456-a560-ba96c61ea6c2" containerName="oc"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.148378 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b19e3d-b80a-4456-a560-ba96c61ea6c2" containerName="oc"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.148814 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-58slt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.152810 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.152945 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.153662 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.157192 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-58slt"]
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.246292 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx75h\" (UniqueName: \"kubernetes.io/projected/8b106cf1-12b8-4ed6-80d2-d60adecf8463-kube-api-access-bx75h\") pod \"auto-csr-approver-29563976-58slt\" (UID: \"8b106cf1-12b8-4ed6-80d2-d60adecf8463\") " pod="openshift-infra/auto-csr-approver-29563976-58slt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.348047 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx75h\" (UniqueName: \"kubernetes.io/projected/8b106cf1-12b8-4ed6-80d2-d60adecf8463-kube-api-access-bx75h\") pod \"auto-csr-approver-29563976-58slt\" (UID: \"8b106cf1-12b8-4ed6-80d2-d60adecf8463\") " pod="openshift-infra/auto-csr-approver-29563976-58slt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.367808 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx75h\" (UniqueName: \"kubernetes.io/projected/8b106cf1-12b8-4ed6-80d2-d60adecf8463-kube-api-access-bx75h\") pod \"auto-csr-approver-29563976-58slt\" (UID: \"8b106cf1-12b8-4ed6-80d2-d60adecf8463\") " pod="openshift-infra/auto-csr-approver-29563976-58slt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.475224 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-58slt"
Mar 18 12:56:00 crc kubenswrapper[4921]: I0318 12:56:00.877050 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-58slt"]
Mar 18 12:56:01 crc kubenswrapper[4921]: I0318 12:56:01.299938 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-58slt" event={"ID":"8b106cf1-12b8-4ed6-80d2-d60adecf8463","Type":"ContainerStarted","Data":"70a7db4bbeb05f7d4e8c9d2bdbbb6a3a669db00db7382fdab070756c5acdc8ab"}
Mar 18 12:56:03 crc kubenswrapper[4921]: I0318 12:56:03.316817 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-58slt" event={"ID":"8b106cf1-12b8-4ed6-80d2-d60adecf8463","Type":"ContainerStarted","Data":"c2926129032f1385a7e1aafb4f41d9dad0d5abab84fda6b99aa8ec375cf84e39"}
Mar 18 12:56:03 crc kubenswrapper[4921]: I0318 12:56:03.337030 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563976-58slt" podStartSLOduration=1.418565445 podStartE2EDuration="3.336998814s" podCreationTimestamp="2026-03-18 12:56:00 +0000 UTC" firstStartedPulling="2026-03-18 12:56:00.885388851 +0000 UTC m=+2780.435309490" lastFinishedPulling="2026-03-18 12:56:02.80382222 +0000 UTC m=+2782.353742859" observedRunningTime="2026-03-18 12:56:03.331514499 +0000 UTC m=+2782.881435138" watchObservedRunningTime="2026-03-18 12:56:03.336998814 +0000 UTC m=+2782.886919453"
Mar 18 12:56:04 crc kubenswrapper[4921]: I0318 12:56:04.390693 4921 generic.go:334] "Generic (PLEG): container finished" podID="8b106cf1-12b8-4ed6-80d2-d60adecf8463" containerID="c2926129032f1385a7e1aafb4f41d9dad0d5abab84fda6b99aa8ec375cf84e39" exitCode=0
Mar 18 12:56:04 crc kubenswrapper[4921]: I0318 12:56:04.390733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-58slt" event={"ID":"8b106cf1-12b8-4ed6-80d2-d60adecf8463","Type":"ContainerDied","Data":"c2926129032f1385a7e1aafb4f41d9dad0d5abab84fda6b99aa8ec375cf84e39"}
Mar 18 12:56:05 crc kubenswrapper[4921]: I0318 12:56:05.670186 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-58slt" Mar 18 12:56:05 crc kubenswrapper[4921]: I0318 12:56:05.804170 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx75h\" (UniqueName: \"kubernetes.io/projected/8b106cf1-12b8-4ed6-80d2-d60adecf8463-kube-api-access-bx75h\") pod \"8b106cf1-12b8-4ed6-80d2-d60adecf8463\" (UID: \"8b106cf1-12b8-4ed6-80d2-d60adecf8463\") " Mar 18 12:56:05 crc kubenswrapper[4921]: I0318 12:56:05.811052 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b106cf1-12b8-4ed6-80d2-d60adecf8463-kube-api-access-bx75h" (OuterVolumeSpecName: "kube-api-access-bx75h") pod "8b106cf1-12b8-4ed6-80d2-d60adecf8463" (UID: "8b106cf1-12b8-4ed6-80d2-d60adecf8463"). InnerVolumeSpecName "kube-api-access-bx75h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:56:05 crc kubenswrapper[4921]: I0318 12:56:05.906063 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx75h\" (UniqueName: \"kubernetes.io/projected/8b106cf1-12b8-4ed6-80d2-d60adecf8463-kube-api-access-bx75h\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:06 crc kubenswrapper[4921]: I0318 12:56:06.393907 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-7q6x2"] Mar 18 12:56:06 crc kubenswrapper[4921]: I0318 12:56:06.399368 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-7q6x2"] Mar 18 12:56:06 crc kubenswrapper[4921]: I0318 12:56:06.404768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-58slt" event={"ID":"8b106cf1-12b8-4ed6-80d2-d60adecf8463","Type":"ContainerDied","Data":"70a7db4bbeb05f7d4e8c9d2bdbbb6a3a669db00db7382fdab070756c5acdc8ab"} Mar 18 12:56:06 crc kubenswrapper[4921]: I0318 12:56:06.404804 4921 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="70a7db4bbeb05f7d4e8c9d2bdbbb6a3a669db00db7382fdab070756c5acdc8ab" Mar 18 12:56:06 crc kubenswrapper[4921]: I0318 12:56:06.404829 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-58slt" Mar 18 12:56:07 crc kubenswrapper[4921]: I0318 12:56:07.217958 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35" path="/var/lib/kubelet/pods/bf00b5ee-7fa4-4ddb-bcbf-235be5a34d35/volumes" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.261574 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w42r8"] Mar 18 12:56:48 crc kubenswrapper[4921]: E0318 12:56:48.262635 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b106cf1-12b8-4ed6-80d2-d60adecf8463" containerName="oc" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.262657 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b106cf1-12b8-4ed6-80d2-d60adecf8463" containerName="oc" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.262911 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b106cf1-12b8-4ed6-80d2-d60adecf8463" containerName="oc" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.264289 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.268593 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w42r8"] Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.316480 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzhf\" (UniqueName: \"kubernetes.io/projected/d34fad76-2195-4106-a8b6-aa4bb5450016-kube-api-access-sdzhf\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.316550 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-utilities\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.316610 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-catalog-content\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.417621 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzhf\" (UniqueName: \"kubernetes.io/projected/d34fad76-2195-4106-a8b6-aa4bb5450016-kube-api-access-sdzhf\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.417928 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-utilities\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.418092 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-catalog-content\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.418432 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-utilities\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.418542 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-catalog-content\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.434824 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzhf\" (UniqueName: \"kubernetes.io/projected/d34fad76-2195-4106-a8b6-aa4bb5450016-kube-api-access-sdzhf\") pod \"certified-operators-w42r8\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:48 crc kubenswrapper[4921]: I0318 12:56:48.585682 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:49 crc kubenswrapper[4921]: I0318 12:56:49.074002 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w42r8"] Mar 18 12:56:49 crc kubenswrapper[4921]: I0318 12:56:49.718503 4921 generic.go:334] "Generic (PLEG): container finished" podID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerID="4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6" exitCode=0 Mar 18 12:56:49 crc kubenswrapper[4921]: I0318 12:56:49.718678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerDied","Data":"4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6"} Mar 18 12:56:49 crc kubenswrapper[4921]: I0318 12:56:49.718897 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerStarted","Data":"99f769256bb67ed5253cfe7d8bb42ddb5942651daac26aa9ddf4b73b76cacc90"} Mar 18 12:56:50 crc kubenswrapper[4921]: I0318 12:56:50.728413 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerStarted","Data":"6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96"} Mar 18 12:56:51 crc kubenswrapper[4921]: I0318 12:56:51.738548 4921 generic.go:334] "Generic (PLEG): container finished" podID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerID="6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96" exitCode=0 Mar 18 12:56:51 crc kubenswrapper[4921]: I0318 12:56:51.738630 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" 
event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerDied","Data":"6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96"} Mar 18 12:56:52 crc kubenswrapper[4921]: I0318 12:56:52.750235 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerStarted","Data":"eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5"} Mar 18 12:56:52 crc kubenswrapper[4921]: I0318 12:56:52.771834 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w42r8" podStartSLOduration=2.299202038 podStartE2EDuration="4.771816408s" podCreationTimestamp="2026-03-18 12:56:48 +0000 UTC" firstStartedPulling="2026-03-18 12:56:49.720639294 +0000 UTC m=+2829.270559953" lastFinishedPulling="2026-03-18 12:56:52.193253664 +0000 UTC m=+2831.743174323" observedRunningTime="2026-03-18 12:56:52.768682299 +0000 UTC m=+2832.318602938" watchObservedRunningTime="2026-03-18 12:56:52.771816408 +0000 UTC m=+2832.321737047" Mar 18 12:56:58 crc kubenswrapper[4921]: I0318 12:56:58.586819 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:58 crc kubenswrapper[4921]: I0318 12:56:58.587176 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:58 crc kubenswrapper[4921]: I0318 12:56:58.630710 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:58 crc kubenswrapper[4921]: I0318 12:56:58.838066 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:56:58 crc kubenswrapper[4921]: I0318 12:56:58.876746 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-w42r8"] Mar 18 12:57:00 crc kubenswrapper[4921]: I0318 12:57:00.811860 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w42r8" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="registry-server" containerID="cri-o://eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5" gracePeriod=2 Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.296279 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.417035 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-catalog-content\") pod \"d34fad76-2195-4106-a8b6-aa4bb5450016\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.417133 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-utilities\") pod \"d34fad76-2195-4106-a8b6-aa4bb5450016\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.417162 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdzhf\" (UniqueName: \"kubernetes.io/projected/d34fad76-2195-4106-a8b6-aa4bb5450016-kube-api-access-sdzhf\") pod \"d34fad76-2195-4106-a8b6-aa4bb5450016\" (UID: \"d34fad76-2195-4106-a8b6-aa4bb5450016\") " Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.418183 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-utilities" (OuterVolumeSpecName: "utilities") pod "d34fad76-2195-4106-a8b6-aa4bb5450016" (UID: 
"d34fad76-2195-4106-a8b6-aa4bb5450016"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.423036 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34fad76-2195-4106-a8b6-aa4bb5450016-kube-api-access-sdzhf" (OuterVolumeSpecName: "kube-api-access-sdzhf") pod "d34fad76-2195-4106-a8b6-aa4bb5450016" (UID: "d34fad76-2195-4106-a8b6-aa4bb5450016"). InnerVolumeSpecName "kube-api-access-sdzhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.473686 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d34fad76-2195-4106-a8b6-aa4bb5450016" (UID: "d34fad76-2195-4106-a8b6-aa4bb5450016"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.519221 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.519387 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d34fad76-2195-4106-a8b6-aa4bb5450016-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.519398 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdzhf\" (UniqueName: \"kubernetes.io/projected/d34fad76-2195-4106-a8b6-aa4bb5450016-kube-api-access-sdzhf\") on node \"crc\" DevicePath \"\"" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.826939 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerID="eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5" exitCode=0 Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.826996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerDied","Data":"eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5"} Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.827035 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w42r8" event={"ID":"d34fad76-2195-4106-a8b6-aa4bb5450016","Type":"ContainerDied","Data":"99f769256bb67ed5253cfe7d8bb42ddb5942651daac26aa9ddf4b73b76cacc90"} Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.827061 4921 scope.go:117] "RemoveContainer" containerID="eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.827093 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w42r8" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.866215 4921 scope.go:117] "RemoveContainer" containerID="6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.876472 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w42r8"] Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.888759 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w42r8"] Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.900804 4921 scope.go:117] "RemoveContainer" containerID="4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.922499 4921 scope.go:117] "RemoveContainer" containerID="eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5" Mar 18 12:57:01 crc kubenswrapper[4921]: E0318 12:57:01.925798 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5\": container with ID starting with eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5 not found: ID does not exist" containerID="eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.925848 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5"} err="failed to get container status \"eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5\": rpc error: code = NotFound desc = could not find container \"eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5\": container with ID starting with eda4f085b5d21363243e91b9b38bc810efcf9b158bd853c923577ea942daf8f5 not 
found: ID does not exist" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.925875 4921 scope.go:117] "RemoveContainer" containerID="6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96" Mar 18 12:57:01 crc kubenswrapper[4921]: E0318 12:57:01.927499 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96\": container with ID starting with 6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96 not found: ID does not exist" containerID="6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.927525 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96"} err="failed to get container status \"6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96\": rpc error: code = NotFound desc = could not find container \"6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96\": container with ID starting with 6e229ed9b6e7e07329b0c7aa35199837e4c42217cc2c111f0013959e8113be96 not found: ID does not exist" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.927540 4921 scope.go:117] "RemoveContainer" containerID="4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6" Mar 18 12:57:01 crc kubenswrapper[4921]: E0318 12:57:01.927920 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6\": container with ID starting with 4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6 not found: ID does not exist" containerID="4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6" Mar 18 12:57:01 crc kubenswrapper[4921]: I0318 12:57:01.927945 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6"} err="failed to get container status \"4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6\": rpc error: code = NotFound desc = could not find container \"4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6\": container with ID starting with 4500efa4cbc794914dabfd5276e8f2e3346c95f4f150c5438bf73dba5e22d0e6 not found: ID does not exist" Mar 18 12:57:03 crc kubenswrapper[4921]: I0318 12:57:03.219060 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" path="/var/lib/kubelet/pods/d34fad76-2195-4106-a8b6-aa4bb5450016/volumes" Mar 18 12:57:05 crc kubenswrapper[4921]: I0318 12:57:05.433437 4921 scope.go:117] "RemoveContainer" containerID="6f1da6cfd02c0020938c5d7b3db54e16b1f230f6d9a3a5cbb712430104bc7f68" Mar 18 12:57:17 crc kubenswrapper[4921]: I0318 12:57:17.080849 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:57:17 crc kubenswrapper[4921]: I0318 12:57:17.081504 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:57:47 crc kubenswrapper[4921]: I0318 12:57:47.081270 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:57:47 crc kubenswrapper[4921]: I0318 12:57:47.081881 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.148608 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563978-zmmbx"] Mar 18 12:58:00 crc kubenswrapper[4921]: E0318 12:58:00.153297 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="extract-content" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.153333 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="extract-content" Mar 18 12:58:00 crc kubenswrapper[4921]: E0318 12:58:00.153348 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="registry-server" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.153356 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="registry-server" Mar 18 12:58:00 crc kubenswrapper[4921]: E0318 12:58:00.153380 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="extract-utilities" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.153393 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" containerName="extract-utilities" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.153630 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34fad76-2195-4106-a8b6-aa4bb5450016" 
containerName="registry-server" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.154611 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.156979 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.157344 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.157761 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.160695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-zmmbx"] Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.277412 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfr6p\" (UniqueName: \"kubernetes.io/projected/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7-kube-api-access-gfr6p\") pod \"auto-csr-approver-29563978-zmmbx\" (UID: \"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7\") " pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.379895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfr6p\" (UniqueName: \"kubernetes.io/projected/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7-kube-api-access-gfr6p\") pod \"auto-csr-approver-29563978-zmmbx\" (UID: \"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7\") " pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.404066 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfr6p\" (UniqueName: 
\"kubernetes.io/projected/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7-kube-api-access-gfr6p\") pod \"auto-csr-approver-29563978-zmmbx\" (UID: \"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7\") " pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.484337 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.916095 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-zmmbx"] Mar 18 12:58:00 crc kubenswrapper[4921]: I0318 12:58:00.924893 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:58:01 crc kubenswrapper[4921]: I0318 12:58:01.268823 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" event={"ID":"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7","Type":"ContainerStarted","Data":"cfa80605c093d8df9925ce169074302c9e309bffecd608d262f58e4786920cbe"} Mar 18 12:58:02 crc kubenswrapper[4921]: I0318 12:58:02.282184 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" event={"ID":"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7","Type":"ContainerStarted","Data":"3cbc1e0884d96adee4b546de26bd329cc7cd75736ec203eb3a3b0cf84d58b3ac"} Mar 18 12:58:02 crc kubenswrapper[4921]: I0318 12:58:02.299244 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" podStartSLOduration=1.349987389 podStartE2EDuration="2.29922552s" podCreationTimestamp="2026-03-18 12:58:00 +0000 UTC" firstStartedPulling="2026-03-18 12:58:00.924683562 +0000 UTC m=+2900.474604201" lastFinishedPulling="2026-03-18 12:58:01.873921703 +0000 UTC m=+2901.423842332" observedRunningTime="2026-03-18 12:58:02.294950949 +0000 UTC m=+2901.844871608" 
watchObservedRunningTime="2026-03-18 12:58:02.29922552 +0000 UTC m=+2901.849146159" Mar 18 12:58:03 crc kubenswrapper[4921]: I0318 12:58:03.291808 4921 generic.go:334] "Generic (PLEG): container finished" podID="a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7" containerID="3cbc1e0884d96adee4b546de26bd329cc7cd75736ec203eb3a3b0cf84d58b3ac" exitCode=0 Mar 18 12:58:03 crc kubenswrapper[4921]: I0318 12:58:03.291909 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" event={"ID":"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7","Type":"ContainerDied","Data":"3cbc1e0884d96adee4b546de26bd329cc7cd75736ec203eb3a3b0cf84d58b3ac"} Mar 18 12:58:04 crc kubenswrapper[4921]: I0318 12:58:04.557064 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:04 crc kubenswrapper[4921]: I0318 12:58:04.671151 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfr6p\" (UniqueName: \"kubernetes.io/projected/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7-kube-api-access-gfr6p\") pod \"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7\" (UID: \"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7\") " Mar 18 12:58:04 crc kubenswrapper[4921]: I0318 12:58:04.679306 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7-kube-api-access-gfr6p" (OuterVolumeSpecName: "kube-api-access-gfr6p") pod "a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7" (UID: "a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7"). InnerVolumeSpecName "kube-api-access-gfr6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:58:04 crc kubenswrapper[4921]: I0318 12:58:04.773245 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfr6p\" (UniqueName: \"kubernetes.io/projected/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7-kube-api-access-gfr6p\") on node \"crc\" DevicePath \"\"" Mar 18 12:58:05 crc kubenswrapper[4921]: I0318 12:58:05.307545 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" event={"ID":"a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7","Type":"ContainerDied","Data":"cfa80605c093d8df9925ce169074302c9e309bffecd608d262f58e4786920cbe"} Mar 18 12:58:05 crc kubenswrapper[4921]: I0318 12:58:05.307624 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfa80605c093d8df9925ce169074302c9e309bffecd608d262f58e4786920cbe" Mar 18 12:58:05 crc kubenswrapper[4921]: I0318 12:58:05.307669 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-zmmbx" Mar 18 12:58:05 crc kubenswrapper[4921]: I0318 12:58:05.373946 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-r8szk"] Mar 18 12:58:05 crc kubenswrapper[4921]: I0318 12:58:05.379504 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-r8szk"] Mar 18 12:58:07 crc kubenswrapper[4921]: I0318 12:58:07.220322 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd523e5-9690-403e-9006-4463ac51d0b1" path="/var/lib/kubelet/pods/7fd523e5-9690-403e-9006-4463ac51d0b1/volumes" Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.080993 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.081959 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.082025 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.083020 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.083098 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" gracePeriod=600 Mar 18 12:58:17 crc kubenswrapper[4921]: E0318 12:58:17.216012 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:58:17 crc kubenswrapper[4921]: 
I0318 12:58:17.415092 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" exitCode=0 Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.415154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785"} Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.415203 4921 scope.go:117] "RemoveContainer" containerID="3057d0ee1cab1f2dc4c0ee285bfc628c8181af952641f69651a778ab15a5c4d3" Mar 18 12:58:17 crc kubenswrapper[4921]: I0318 12:58:17.415737 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:58:17 crc kubenswrapper[4921]: E0318 12:58:17.416171 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:58:32 crc kubenswrapper[4921]: I0318 12:58:32.209249 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:58:32 crc kubenswrapper[4921]: E0318 12:58:32.209969 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:58:44 crc kubenswrapper[4921]: I0318 12:58:44.209138 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:58:44 crc kubenswrapper[4921]: E0318 12:58:44.209811 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:58:59 crc kubenswrapper[4921]: I0318 12:58:59.210124 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:58:59 crc kubenswrapper[4921]: E0318 12:58:59.210980 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:59:05 crc kubenswrapper[4921]: I0318 12:59:05.527228 4921 scope.go:117] "RemoveContainer" containerID="828643624e7451c874b6fbdd0520771773993349be075039c5cbf03cdc016eeb" Mar 18 12:59:11 crc kubenswrapper[4921]: I0318 12:59:11.218146 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:59:11 crc kubenswrapper[4921]: E0318 12:59:11.219495 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.823375 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pwsds"] Mar 18 12:59:20 crc kubenswrapper[4921]: E0318 12:59:20.824228 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7" containerName="oc" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.824239 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7" containerName="oc" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.824387 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7" containerName="oc" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.826973 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.840610 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwsds"] Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.903637 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-utilities\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.903756 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-catalog-content\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:20 crc kubenswrapper[4921]: I0318 12:59:20.903834 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k46hj\" (UniqueName: \"kubernetes.io/projected/b76cdce1-2733-4d9d-9152-93bd32718546-kube-api-access-k46hj\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.005161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-utilities\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.005259 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-catalog-content\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.005307 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k46hj\" (UniqueName: \"kubernetes.io/projected/b76cdce1-2733-4d9d-9152-93bd32718546-kube-api-access-k46hj\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.006179 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-catalog-content\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.006664 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-utilities\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.027820 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k46hj\" (UniqueName: \"kubernetes.io/projected/b76cdce1-2733-4d9d-9152-93bd32718546-kube-api-access-k46hj\") pod \"community-operators-pwsds\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.143530 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.639354 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwsds"] Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.879920 4921 generic.go:334] "Generic (PLEG): container finished" podID="b76cdce1-2733-4d9d-9152-93bd32718546" containerID="aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137" exitCode=0 Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.880024 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwsds" event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerDied","Data":"aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137"} Mar 18 12:59:21 crc kubenswrapper[4921]: I0318 12:59:21.881367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwsds" event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerStarted","Data":"d897a1f29b76ecf138ce08069ec25d389a3137074705ab5037590c2c63738368"} Mar 18 12:59:22 crc kubenswrapper[4921]: I0318 12:59:22.889208 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwsds" event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerStarted","Data":"cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339"} Mar 18 12:59:23 crc kubenswrapper[4921]: I0318 12:59:23.897000 4921 generic.go:334] "Generic (PLEG): container finished" podID="b76cdce1-2733-4d9d-9152-93bd32718546" containerID="cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339" exitCode=0 Mar 18 12:59:23 crc kubenswrapper[4921]: I0318 12:59:23.897073 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwsds" 
event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerDied","Data":"cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339"} Mar 18 12:59:24 crc kubenswrapper[4921]: I0318 12:59:24.209734 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:59:24 crc kubenswrapper[4921]: E0318 12:59:24.210195 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:59:24 crc kubenswrapper[4921]: I0318 12:59:24.906066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwsds" event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerStarted","Data":"50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669"} Mar 18 12:59:24 crc kubenswrapper[4921]: I0318 12:59:24.927948 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pwsds" podStartSLOduration=2.28879627 podStartE2EDuration="4.92792942s" podCreationTimestamp="2026-03-18 12:59:20 +0000 UTC" firstStartedPulling="2026-03-18 12:59:21.881254505 +0000 UTC m=+2981.431175144" lastFinishedPulling="2026-03-18 12:59:24.520387645 +0000 UTC m=+2984.070308294" observedRunningTime="2026-03-18 12:59:24.920690104 +0000 UTC m=+2984.470610743" watchObservedRunningTime="2026-03-18 12:59:24.92792942 +0000 UTC m=+2984.477850069" Mar 18 12:59:31 crc kubenswrapper[4921]: I0318 12:59:31.144421 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:31 crc 
kubenswrapper[4921]: I0318 12:59:31.145018 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:31 crc kubenswrapper[4921]: I0318 12:59:31.193717 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:31 crc kubenswrapper[4921]: I0318 12:59:31.996998 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:32 crc kubenswrapper[4921]: I0318 12:59:32.044463 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwsds"] Mar 18 12:59:33 crc kubenswrapper[4921]: I0318 12:59:33.971463 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pwsds" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="registry-server" containerID="cri-o://50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669" gracePeriod=2 Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.333856 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.439420 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-utilities\") pod \"b76cdce1-2733-4d9d-9152-93bd32718546\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.439499 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-catalog-content\") pod \"b76cdce1-2733-4d9d-9152-93bd32718546\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.439537 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k46hj\" (UniqueName: \"kubernetes.io/projected/b76cdce1-2733-4d9d-9152-93bd32718546-kube-api-access-k46hj\") pod \"b76cdce1-2733-4d9d-9152-93bd32718546\" (UID: \"b76cdce1-2733-4d9d-9152-93bd32718546\") " Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.440653 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-utilities" (OuterVolumeSpecName: "utilities") pod "b76cdce1-2733-4d9d-9152-93bd32718546" (UID: "b76cdce1-2733-4d9d-9152-93bd32718546"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.445793 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76cdce1-2733-4d9d-9152-93bd32718546-kube-api-access-k46hj" (OuterVolumeSpecName: "kube-api-access-k46hj") pod "b76cdce1-2733-4d9d-9152-93bd32718546" (UID: "b76cdce1-2733-4d9d-9152-93bd32718546"). InnerVolumeSpecName "kube-api-access-k46hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.491650 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b76cdce1-2733-4d9d-9152-93bd32718546" (UID: "b76cdce1-2733-4d9d-9152-93bd32718546"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.541141 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.541191 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k46hj\" (UniqueName: \"kubernetes.io/projected/b76cdce1-2733-4d9d-9152-93bd32718546-kube-api-access-k46hj\") on node \"crc\" DevicePath \"\"" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.541213 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76cdce1-2733-4d9d-9152-93bd32718546-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.979896 4921 generic.go:334] "Generic (PLEG): container finished" podID="b76cdce1-2733-4d9d-9152-93bd32718546" containerID="50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669" exitCode=0 Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.979946 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwsds" event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerDied","Data":"50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669"} Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.979983 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-pwsds" event={"ID":"b76cdce1-2733-4d9d-9152-93bd32718546","Type":"ContainerDied","Data":"d897a1f29b76ecf138ce08069ec25d389a3137074705ab5037590c2c63738368"} Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.980003 4921 scope.go:117] "RemoveContainer" containerID="50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669" Mar 18 12:59:34 crc kubenswrapper[4921]: I0318 12:59:34.980018 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwsds" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.011347 4921 scope.go:117] "RemoveContainer" containerID="cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.014504 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwsds"] Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.029468 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pwsds"] Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.032006 4921 scope.go:117] "RemoveContainer" containerID="aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.074218 4921 scope.go:117] "RemoveContainer" containerID="50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669" Mar 18 12:59:35 crc kubenswrapper[4921]: E0318 12:59:35.074712 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669\": container with ID starting with 50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669 not found: ID does not exist" containerID="50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 
12:59:35.074754 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669"} err="failed to get container status \"50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669\": rpc error: code = NotFound desc = could not find container \"50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669\": container with ID starting with 50fc0942aca08ef681a322b750d83ada3757e3894db85ba1dc11be78a23e8669 not found: ID does not exist" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.074779 4921 scope.go:117] "RemoveContainer" containerID="cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339" Mar 18 12:59:35 crc kubenswrapper[4921]: E0318 12:59:35.075578 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339\": container with ID starting with cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339 not found: ID does not exist" containerID="cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.075607 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339"} err="failed to get container status \"cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339\": rpc error: code = NotFound desc = could not find container \"cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339\": container with ID starting with cf3c28158c309ac3a831bfa06d8194713545432baa4799fc61f131e5c578f339 not found: ID does not exist" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.075621 4921 scope.go:117] "RemoveContainer" containerID="aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137" Mar 18 12:59:35 crc 
kubenswrapper[4921]: E0318 12:59:35.076075 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137\": container with ID starting with aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137 not found: ID does not exist" containerID="aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.076096 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137"} err="failed to get container status \"aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137\": rpc error: code = NotFound desc = could not find container \"aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137\": container with ID starting with aa176f807f3a9b70a7360fa4c4c87dbeba4cf444b0353d2724139f1b3df40137 not found: ID does not exist" Mar 18 12:59:35 crc kubenswrapper[4921]: I0318 12:59:35.217815 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" path="/var/lib/kubelet/pods/b76cdce1-2733-4d9d-9152-93bd32718546/volumes" Mar 18 12:59:39 crc kubenswrapper[4921]: I0318 12:59:39.209905 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:59:39 crc kubenswrapper[4921]: E0318 12:59:39.211362 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 12:59:52 crc 
kubenswrapper[4921]: I0318 12:59:52.209055 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 12:59:52 crc kubenswrapper[4921]: E0318 12:59:52.209735 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.170501 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563980-jp58n"] Mar 18 13:00:00 crc kubenswrapper[4921]: E0318 13:00:00.171560 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="registry-server" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.171584 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="registry-server" Mar 18 13:00:00 crc kubenswrapper[4921]: E0318 13:00:00.171615 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="extract-utilities" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.171627 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="extract-utilities" Mar 18 13:00:00 crc kubenswrapper[4921]: E0318 13:00:00.171640 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="extract-content" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.171652 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" 
containerName="extract-content" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.171893 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76cdce1-2733-4d9d-9152-93bd32718546" containerName="registry-server" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.172710 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.174790 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.175412 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.176685 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.179574 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf"] Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.180581 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.182315 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.183371 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.194984 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-jp58n"] Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.202218 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf"] Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.243918 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5xlr\" (UniqueName: \"kubernetes.io/projected/e7e2501f-5413-44f3-beba-c05c5cd108ab-kube-api-access-m5xlr\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.244060 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e2501f-5413-44f3-beba-c05c5cd108ab-config-volume\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.244153 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbnf9\" (UniqueName: 
\"kubernetes.io/projected/2ebf67d9-23d4-42bc-8493-5558214b76b3-kube-api-access-qbnf9\") pod \"auto-csr-approver-29563980-jp58n\" (UID: \"2ebf67d9-23d4-42bc-8493-5558214b76b3\") " pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.244267 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e2501f-5413-44f3-beba-c05c5cd108ab-secret-volume\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.345933 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5xlr\" (UniqueName: \"kubernetes.io/projected/e7e2501f-5413-44f3-beba-c05c5cd108ab-kube-api-access-m5xlr\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.346011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e2501f-5413-44f3-beba-c05c5cd108ab-config-volume\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.346045 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbnf9\" (UniqueName: \"kubernetes.io/projected/2ebf67d9-23d4-42bc-8493-5558214b76b3-kube-api-access-qbnf9\") pod \"auto-csr-approver-29563980-jp58n\" (UID: \"2ebf67d9-23d4-42bc-8493-5558214b76b3\") " pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 
13:00:00.346174 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e2501f-5413-44f3-beba-c05c5cd108ab-secret-volume\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.347806 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e2501f-5413-44f3-beba-c05c5cd108ab-config-volume\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.358758 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e2501f-5413-44f3-beba-c05c5cd108ab-secret-volume\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.363101 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5xlr\" (UniqueName: \"kubernetes.io/projected/e7e2501f-5413-44f3-beba-c05c5cd108ab-kube-api-access-m5xlr\") pod \"collect-profiles-29563980-wgngf\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.364492 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbnf9\" (UniqueName: \"kubernetes.io/projected/2ebf67d9-23d4-42bc-8493-5558214b76b3-kube-api-access-qbnf9\") pod \"auto-csr-approver-29563980-jp58n\" (UID: \"2ebf67d9-23d4-42bc-8493-5558214b76b3\") " 
pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.494082 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.504544 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.941841 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-jp58n"] Mar 18 13:00:00 crc kubenswrapper[4921]: I0318 13:00:00.992852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf"] Mar 18 13:00:00 crc kubenswrapper[4921]: W0318 13:00:00.995825 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e2501f_5413_44f3_beba_c05c5cd108ab.slice/crio-2e146c4e94399ce4383e110050634f9792b47e413dad2bc82295a442b093f0e2 WatchSource:0}: Error finding container 2e146c4e94399ce4383e110050634f9792b47e413dad2bc82295a442b093f0e2: Status 404 returned error can't find the container with id 2e146c4e94399ce4383e110050634f9792b47e413dad2bc82295a442b093f0e2 Mar 18 13:00:01 crc kubenswrapper[4921]: I0318 13:00:01.207875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-jp58n" event={"ID":"2ebf67d9-23d4-42bc-8493-5558214b76b3","Type":"ContainerStarted","Data":"98dcd2c6cbf688b9c39b9fc93b80bcaf4fde8817c4db1c0b65d4b0ef31555d45"} Mar 18 13:00:01 crc kubenswrapper[4921]: I0318 13:00:01.220926 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" 
event={"ID":"e7e2501f-5413-44f3-beba-c05c5cd108ab","Type":"ContainerStarted","Data":"32ca28282b9917bb375c16f381c375a9ba77e40bafe6ccce322a0bb6a6a86e75"} Mar 18 13:00:01 crc kubenswrapper[4921]: I0318 13:00:01.220988 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" event={"ID":"e7e2501f-5413-44f3-beba-c05c5cd108ab","Type":"ContainerStarted","Data":"2e146c4e94399ce4383e110050634f9792b47e413dad2bc82295a442b093f0e2"} Mar 18 13:00:01 crc kubenswrapper[4921]: I0318 13:00:01.278434 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" podStartSLOduration=1.278413027 podStartE2EDuration="1.278413027s" podCreationTimestamp="2026-03-18 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:00:01.273504828 +0000 UTC m=+3020.823425467" watchObservedRunningTime="2026-03-18 13:00:01.278413027 +0000 UTC m=+3020.828333666" Mar 18 13:00:02 crc kubenswrapper[4921]: I0318 13:00:02.220285 4921 generic.go:334] "Generic (PLEG): container finished" podID="e7e2501f-5413-44f3-beba-c05c5cd108ab" containerID="32ca28282b9917bb375c16f381c375a9ba77e40bafe6ccce322a0bb6a6a86e75" exitCode=0 Mar 18 13:00:02 crc kubenswrapper[4921]: I0318 13:00:02.220373 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" event={"ID":"e7e2501f-5413-44f3-beba-c05c5cd108ab","Type":"ContainerDied","Data":"32ca28282b9917bb375c16f381c375a9ba77e40bafe6ccce322a0bb6a6a86e75"} Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.511246 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.699026 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5xlr\" (UniqueName: \"kubernetes.io/projected/e7e2501f-5413-44f3-beba-c05c5cd108ab-kube-api-access-m5xlr\") pod \"e7e2501f-5413-44f3-beba-c05c5cd108ab\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.699176 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e2501f-5413-44f3-beba-c05c5cd108ab-secret-volume\") pod \"e7e2501f-5413-44f3-beba-c05c5cd108ab\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.699229 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e2501f-5413-44f3-beba-c05c5cd108ab-config-volume\") pod \"e7e2501f-5413-44f3-beba-c05c5cd108ab\" (UID: \"e7e2501f-5413-44f3-beba-c05c5cd108ab\") " Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.700275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e2501f-5413-44f3-beba-c05c5cd108ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7e2501f-5413-44f3-beba-c05c5cd108ab" (UID: "e7e2501f-5413-44f3-beba-c05c5cd108ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.705323 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e2501f-5413-44f3-beba-c05c5cd108ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7e2501f-5413-44f3-beba-c05c5cd108ab" (UID: "e7e2501f-5413-44f3-beba-c05c5cd108ab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.709779 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e2501f-5413-44f3-beba-c05c5cd108ab-kube-api-access-m5xlr" (OuterVolumeSpecName: "kube-api-access-m5xlr") pod "e7e2501f-5413-44f3-beba-c05c5cd108ab" (UID: "e7e2501f-5413-44f3-beba-c05c5cd108ab"). InnerVolumeSpecName "kube-api-access-m5xlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.800496 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7e2501f-5413-44f3-beba-c05c5cd108ab-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.800531 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7e2501f-5413-44f3-beba-c05c5cd108ab-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:03 crc kubenswrapper[4921]: I0318 13:00:03.800543 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5xlr\" (UniqueName: \"kubernetes.io/projected/e7e2501f-5413-44f3-beba-c05c5cd108ab-kube-api-access-m5xlr\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:04 crc kubenswrapper[4921]: I0318 13:00:04.209310 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:00:04 crc kubenswrapper[4921]: E0318 13:00:04.209662 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" 
Mar 18 13:00:04 crc kubenswrapper[4921]: I0318 13:00:04.242366 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" Mar 18 13:00:04 crc kubenswrapper[4921]: I0318 13:00:04.242358 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf" event={"ID":"e7e2501f-5413-44f3-beba-c05c5cd108ab","Type":"ContainerDied","Data":"2e146c4e94399ce4383e110050634f9792b47e413dad2bc82295a442b093f0e2"} Mar 18 13:00:04 crc kubenswrapper[4921]: I0318 13:00:04.242490 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e146c4e94399ce4383e110050634f9792b47e413dad2bc82295a442b093f0e2" Mar 18 13:00:04 crc kubenswrapper[4921]: I0318 13:00:04.312389 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8"] Mar 18 13:00:04 crc kubenswrapper[4921]: I0318 13:00:04.317562 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-slzc8"] Mar 18 13:00:05 crc kubenswrapper[4921]: I0318 13:00:05.219624 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56299bf-8e89-4425-914d-1aa2f0e635af" path="/var/lib/kubelet/pods/c56299bf-8e89-4425-914d-1aa2f0e635af/volumes" Mar 18 13:00:05 crc kubenswrapper[4921]: I0318 13:00:05.600337 4921 scope.go:117] "RemoveContainer" containerID="9fcf4697e2c7d18c089b8035cad2591e0554412a34d6f3af359fff72d5505a16" Mar 18 13:00:08 crc kubenswrapper[4921]: I0318 13:00:08.275055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-jp58n" event={"ID":"2ebf67d9-23d4-42bc-8493-5558214b76b3","Type":"ContainerStarted","Data":"9a3389b29836630762ba276e55c2b3679926584590a4669e52a112dfd31bc939"} Mar 18 13:00:08 crc kubenswrapper[4921]: I0318 13:00:08.293061 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563980-jp58n" podStartSLOduration=1.319925976 podStartE2EDuration="8.293045193s" podCreationTimestamp="2026-03-18 13:00:00 +0000 UTC" firstStartedPulling="2026-03-18 13:00:00.949101692 +0000 UTC m=+3020.499022331" lastFinishedPulling="2026-03-18 13:00:07.922220909 +0000 UTC m=+3027.472141548" observedRunningTime="2026-03-18 13:00:08.293011412 +0000 UTC m=+3027.842932071" watchObservedRunningTime="2026-03-18 13:00:08.293045193 +0000 UTC m=+3027.842965832" Mar 18 13:00:09 crc kubenswrapper[4921]: I0318 13:00:09.282434 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ebf67d9-23d4-42bc-8493-5558214b76b3" containerID="9a3389b29836630762ba276e55c2b3679926584590a4669e52a112dfd31bc939" exitCode=0 Mar 18 13:00:09 crc kubenswrapper[4921]: I0318 13:00:09.282469 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-jp58n" event={"ID":"2ebf67d9-23d4-42bc-8493-5558214b76b3","Type":"ContainerDied","Data":"9a3389b29836630762ba276e55c2b3679926584590a4669e52a112dfd31bc939"} Mar 18 13:00:10 crc kubenswrapper[4921]: I0318 13:00:10.548081 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:10 crc kubenswrapper[4921]: I0318 13:00:10.697938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbnf9\" (UniqueName: \"kubernetes.io/projected/2ebf67d9-23d4-42bc-8493-5558214b76b3-kube-api-access-qbnf9\") pod \"2ebf67d9-23d4-42bc-8493-5558214b76b3\" (UID: \"2ebf67d9-23d4-42bc-8493-5558214b76b3\") " Mar 18 13:00:10 crc kubenswrapper[4921]: I0318 13:00:10.703652 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ebf67d9-23d4-42bc-8493-5558214b76b3-kube-api-access-qbnf9" (OuterVolumeSpecName: "kube-api-access-qbnf9") pod "2ebf67d9-23d4-42bc-8493-5558214b76b3" (UID: "2ebf67d9-23d4-42bc-8493-5558214b76b3"). InnerVolumeSpecName "kube-api-access-qbnf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:00:10 crc kubenswrapper[4921]: I0318 13:00:10.799364 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbnf9\" (UniqueName: \"kubernetes.io/projected/2ebf67d9-23d4-42bc-8493-5558214b76b3-kube-api-access-qbnf9\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:11 crc kubenswrapper[4921]: I0318 13:00:11.299465 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-jp58n" event={"ID":"2ebf67d9-23d4-42bc-8493-5558214b76b3","Type":"ContainerDied","Data":"98dcd2c6cbf688b9c39b9fc93b80bcaf4fde8817c4db1c0b65d4b0ef31555d45"} Mar 18 13:00:11 crc kubenswrapper[4921]: I0318 13:00:11.299530 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98dcd2c6cbf688b9c39b9fc93b80bcaf4fde8817c4db1c0b65d4b0ef31555d45" Mar 18 13:00:11 crc kubenswrapper[4921]: I0318 13:00:11.299611 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-jp58n" Mar 18 13:00:11 crc kubenswrapper[4921]: I0318 13:00:11.347534 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-jvwct"] Mar 18 13:00:11 crc kubenswrapper[4921]: I0318 13:00:11.352780 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-jvwct"] Mar 18 13:00:13 crc kubenswrapper[4921]: I0318 13:00:13.223967 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b19e3d-b80a-4456-a560-ba96c61ea6c2" path="/var/lib/kubelet/pods/85b19e3d-b80a-4456-a560-ba96c61ea6c2/volumes" Mar 18 13:00:16 crc kubenswrapper[4921]: I0318 13:00:16.209655 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:00:16 crc kubenswrapper[4921]: E0318 13:00:16.210204 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:00:31 crc kubenswrapper[4921]: I0318 13:00:31.214200 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:00:31 crc kubenswrapper[4921]: E0318 13:00:31.215271 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:00:44 crc kubenswrapper[4921]: I0318 13:00:44.209308 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:00:44 crc kubenswrapper[4921]: E0318 13:00:44.209907 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:00:58 crc kubenswrapper[4921]: I0318 13:00:58.210312 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:00:58 crc kubenswrapper[4921]: E0318 13:00:58.212813 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:01:05 crc kubenswrapper[4921]: I0318 13:01:05.683795 4921 scope.go:117] "RemoveContainer" containerID="f0be9b609c6b7cc4dadd8c922a104b3e0c6790d23212c69978f6e8913ce75db1" Mar 18 13:01:09 crc kubenswrapper[4921]: I0318 13:01:09.209339 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:01:09 crc kubenswrapper[4921]: E0318 13:01:09.209897 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:01:23 crc kubenswrapper[4921]: I0318 13:01:23.208833 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:01:23 crc kubenswrapper[4921]: E0318 13:01:23.209691 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:01:38 crc kubenswrapper[4921]: I0318 13:01:38.209425 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:01:38 crc kubenswrapper[4921]: E0318 13:01:38.210063 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:01:52 crc kubenswrapper[4921]: I0318 13:01:52.209345 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:01:52 crc kubenswrapper[4921]: E0318 13:01:52.210149 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.148598 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563982-zb644"] Mar 18 13:02:00 crc kubenswrapper[4921]: E0318 13:02:00.150563 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebf67d9-23d4-42bc-8493-5558214b76b3" containerName="oc" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.150661 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebf67d9-23d4-42bc-8493-5558214b76b3" containerName="oc" Mar 18 13:02:00 crc kubenswrapper[4921]: E0318 13:02:00.150741 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e2501f-5413-44f3-beba-c05c5cd108ab" containerName="collect-profiles" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.150813 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e2501f-5413-44f3-beba-c05c5cd108ab" containerName="collect-profiles" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.151013 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e2501f-5413-44f3-beba-c05c5cd108ab" containerName="collect-profiles" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.151088 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ebf67d9-23d4-42bc-8493-5558214b76b3" containerName="oc" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.151673 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.155990 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.156211 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.157594 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.162485 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-zb644"] Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.322313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczrk\" (UniqueName: \"kubernetes.io/projected/701ce28f-e7a4-493d-94ec-db21d0569e2c-kube-api-access-rczrk\") pod \"auto-csr-approver-29563982-zb644\" (UID: \"701ce28f-e7a4-493d-94ec-db21d0569e2c\") " pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.424562 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczrk\" (UniqueName: \"kubernetes.io/projected/701ce28f-e7a4-493d-94ec-db21d0569e2c-kube-api-access-rczrk\") pod \"auto-csr-approver-29563982-zb644\" (UID: \"701ce28f-e7a4-493d-94ec-db21d0569e2c\") " pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.442809 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczrk\" (UniqueName: \"kubernetes.io/projected/701ce28f-e7a4-493d-94ec-db21d0569e2c-kube-api-access-rczrk\") pod \"auto-csr-approver-29563982-zb644\" (UID: \"701ce28f-e7a4-493d-94ec-db21d0569e2c\") " 
pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.477567 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:00 crc kubenswrapper[4921]: I0318 13:02:00.913022 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-zb644"] Mar 18 13:02:00 crc kubenswrapper[4921]: W0318 13:02:00.920935 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701ce28f_e7a4_493d_94ec_db21d0569e2c.slice/crio-b14e72eb943a123ecae847fdfbb8092c07302a8ff046ae1d273510157bdbe309 WatchSource:0}: Error finding container b14e72eb943a123ecae847fdfbb8092c07302a8ff046ae1d273510157bdbe309: Status 404 returned error can't find the container with id b14e72eb943a123ecae847fdfbb8092c07302a8ff046ae1d273510157bdbe309 Mar 18 13:02:01 crc kubenswrapper[4921]: I0318 13:02:01.093617 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-zb644" event={"ID":"701ce28f-e7a4-493d-94ec-db21d0569e2c","Type":"ContainerStarted","Data":"b14e72eb943a123ecae847fdfbb8092c07302a8ff046ae1d273510157bdbe309"} Mar 18 13:02:03 crc kubenswrapper[4921]: I0318 13:02:03.108257 4921 generic.go:334] "Generic (PLEG): container finished" podID="701ce28f-e7a4-493d-94ec-db21d0569e2c" containerID="83154aa6cebb8e0003afad517f1feb77846dabe121d4df1b80c6ed17984b44c4" exitCode=0 Mar 18 13:02:03 crc kubenswrapper[4921]: I0318 13:02:03.108321 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-zb644" event={"ID":"701ce28f-e7a4-493d-94ec-db21d0569e2c","Type":"ContainerDied","Data":"83154aa6cebb8e0003afad517f1feb77846dabe121d4df1b80c6ed17984b44c4"} Mar 18 13:02:04 crc kubenswrapper[4921]: I0318 13:02:04.407902 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:04 crc kubenswrapper[4921]: I0318 13:02:04.591252 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczrk\" (UniqueName: \"kubernetes.io/projected/701ce28f-e7a4-493d-94ec-db21d0569e2c-kube-api-access-rczrk\") pod \"701ce28f-e7a4-493d-94ec-db21d0569e2c\" (UID: \"701ce28f-e7a4-493d-94ec-db21d0569e2c\") " Mar 18 13:02:04 crc kubenswrapper[4921]: I0318 13:02:04.597376 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701ce28f-e7a4-493d-94ec-db21d0569e2c-kube-api-access-rczrk" (OuterVolumeSpecName: "kube-api-access-rczrk") pod "701ce28f-e7a4-493d-94ec-db21d0569e2c" (UID: "701ce28f-e7a4-493d-94ec-db21d0569e2c"). InnerVolumeSpecName "kube-api-access-rczrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:02:04 crc kubenswrapper[4921]: I0318 13:02:04.693778 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczrk\" (UniqueName: \"kubernetes.io/projected/701ce28f-e7a4-493d-94ec-db21d0569e2c-kube-api-access-rczrk\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:05 crc kubenswrapper[4921]: I0318 13:02:05.124760 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-zb644" event={"ID":"701ce28f-e7a4-493d-94ec-db21d0569e2c","Type":"ContainerDied","Data":"b14e72eb943a123ecae847fdfbb8092c07302a8ff046ae1d273510157bdbe309"} Mar 18 13:02:05 crc kubenswrapper[4921]: I0318 13:02:05.125047 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b14e72eb943a123ecae847fdfbb8092c07302a8ff046ae1d273510157bdbe309" Mar 18 13:02:05 crc kubenswrapper[4921]: I0318 13:02:05.124826 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-zb644" Mar 18 13:02:05 crc kubenswrapper[4921]: I0318 13:02:05.486354 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-58slt"] Mar 18 13:02:05 crc kubenswrapper[4921]: I0318 13:02:05.491521 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-58slt"] Mar 18 13:02:07 crc kubenswrapper[4921]: I0318 13:02:07.210871 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:02:07 crc kubenswrapper[4921]: E0318 13:02:07.211719 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:02:07 crc kubenswrapper[4921]: I0318 13:02:07.217975 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b106cf1-12b8-4ed6-80d2-d60adecf8463" path="/var/lib/kubelet/pods/8b106cf1-12b8-4ed6-80d2-d60adecf8463/volumes" Mar 18 13:02:18 crc kubenswrapper[4921]: I0318 13:02:18.213885 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:02:18 crc kubenswrapper[4921]: E0318 13:02:18.217332 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:02:29 crc kubenswrapper[4921]: I0318 13:02:29.209564 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:02:29 crc kubenswrapper[4921]: E0318 13:02:29.210510 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:02:41 crc kubenswrapper[4921]: I0318 13:02:41.212799 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:02:41 crc kubenswrapper[4921]: E0318 13:02:41.213480 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.470333 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkpsd"] Mar 18 13:02:44 crc kubenswrapper[4921]: E0318 13:02:44.470900 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701ce28f-e7a4-493d-94ec-db21d0569e2c" containerName="oc" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.470912 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="701ce28f-e7a4-493d-94ec-db21d0569e2c" containerName="oc" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.471067 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="701ce28f-e7a4-493d-94ec-db21d0569e2c" containerName="oc" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.472023 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.478285 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkpsd"] Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.656281 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-catalog-content\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.656419 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-utilities\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.656510 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw9n8\" (UniqueName: \"kubernetes.io/projected/9abc9a8f-c31f-4659-b129-496169835e6e-kube-api-access-pw9n8\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.757836 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw9n8\" (UniqueName: \"kubernetes.io/projected/9abc9a8f-c31f-4659-b129-496169835e6e-kube-api-access-pw9n8\") pod 
\"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.757924 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-catalog-content\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.757986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-utilities\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.758446 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-utilities\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.758966 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-catalog-content\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.780311 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw9n8\" (UniqueName: \"kubernetes.io/projected/9abc9a8f-c31f-4659-b129-496169835e6e-kube-api-access-pw9n8\") pod \"redhat-operators-zkpsd\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " 
pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:44 crc kubenswrapper[4921]: I0318 13:02:44.792614 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:45 crc kubenswrapper[4921]: I0318 13:02:45.222294 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkpsd"] Mar 18 13:02:45 crc kubenswrapper[4921]: I0318 13:02:45.427024 4921 generic.go:334] "Generic (PLEG): container finished" podID="9abc9a8f-c31f-4659-b129-496169835e6e" containerID="8a7fe07dfe19be0834af0b05aee61f0417590a9212ab896fb02d20cf2d292a05" exitCode=0 Mar 18 13:02:45 crc kubenswrapper[4921]: I0318 13:02:45.427182 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkpsd" event={"ID":"9abc9a8f-c31f-4659-b129-496169835e6e","Type":"ContainerDied","Data":"8a7fe07dfe19be0834af0b05aee61f0417590a9212ab896fb02d20cf2d292a05"} Mar 18 13:02:45 crc kubenswrapper[4921]: I0318 13:02:45.427384 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkpsd" event={"ID":"9abc9a8f-c31f-4659-b129-496169835e6e","Type":"ContainerStarted","Data":"db64823bace6641fcda459f24914c1f1fd310720128f4e12616036be59f256a9"} Mar 18 13:02:47 crc kubenswrapper[4921]: I0318 13:02:47.444715 4921 generic.go:334] "Generic (PLEG): container finished" podID="9abc9a8f-c31f-4659-b129-496169835e6e" containerID="5e452d43c2e44473324c1f31c0f2231610b2f0700971f5ebaf3bf995099b1a52" exitCode=0 Mar 18 13:02:47 crc kubenswrapper[4921]: I0318 13:02:47.444790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkpsd" event={"ID":"9abc9a8f-c31f-4659-b129-496169835e6e","Type":"ContainerDied","Data":"5e452d43c2e44473324c1f31c0f2231610b2f0700971f5ebaf3bf995099b1a52"} Mar 18 13:02:48 crc kubenswrapper[4921]: I0318 13:02:48.456207 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zkpsd" event={"ID":"9abc9a8f-c31f-4659-b129-496169835e6e","Type":"ContainerStarted","Data":"1ed71ce1ed5ab68337e6c3a6c46044ada43c6211bb515b5c1be1c89c5b18ad7f"} Mar 18 13:02:48 crc kubenswrapper[4921]: I0318 13:02:48.482660 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkpsd" podStartSLOduration=1.997531762 podStartE2EDuration="4.482643351s" podCreationTimestamp="2026-03-18 13:02:44 +0000 UTC" firstStartedPulling="2026-03-18 13:02:45.42874278 +0000 UTC m=+3184.978663419" lastFinishedPulling="2026-03-18 13:02:47.913854369 +0000 UTC m=+3187.463775008" observedRunningTime="2026-03-18 13:02:48.479163112 +0000 UTC m=+3188.029083751" watchObservedRunningTime="2026-03-18 13:02:48.482643351 +0000 UTC m=+3188.032563990" Mar 18 13:02:54 crc kubenswrapper[4921]: I0318 13:02:54.209502 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:02:54 crc kubenswrapper[4921]: E0318 13:02:54.210234 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:02:54 crc kubenswrapper[4921]: I0318 13:02:54.793374 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:54 crc kubenswrapper[4921]: I0318 13:02:54.793416 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:54 crc kubenswrapper[4921]: I0318 13:02:54.835962 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:55 crc kubenswrapper[4921]: I0318 13:02:55.543317 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:55 crc kubenswrapper[4921]: I0318 13:02:55.587656 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkpsd"] Mar 18 13:02:57 crc kubenswrapper[4921]: I0318 13:02:57.510276 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zkpsd" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="registry-server" containerID="cri-o://1ed71ce1ed5ab68337e6c3a6c46044ada43c6211bb515b5c1be1c89c5b18ad7f" gracePeriod=2 Mar 18 13:02:58 crc kubenswrapper[4921]: I0318 13:02:58.519581 4921 generic.go:334] "Generic (PLEG): container finished" podID="9abc9a8f-c31f-4659-b129-496169835e6e" containerID="1ed71ce1ed5ab68337e6c3a6c46044ada43c6211bb515b5c1be1c89c5b18ad7f" exitCode=0 Mar 18 13:02:58 crc kubenswrapper[4921]: I0318 13:02:58.519636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkpsd" event={"ID":"9abc9a8f-c31f-4659-b129-496169835e6e","Type":"ContainerDied","Data":"1ed71ce1ed5ab68337e6c3a6c46044ada43c6211bb515b5c1be1c89c5b18ad7f"} Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.004640 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.176784 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-catalog-content\") pod \"9abc9a8f-c31f-4659-b129-496169835e6e\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.176905 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-utilities\") pod \"9abc9a8f-c31f-4659-b129-496169835e6e\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.177010 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw9n8\" (UniqueName: \"kubernetes.io/projected/9abc9a8f-c31f-4659-b129-496169835e6e-kube-api-access-pw9n8\") pod \"9abc9a8f-c31f-4659-b129-496169835e6e\" (UID: \"9abc9a8f-c31f-4659-b129-496169835e6e\") " Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.178197 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-utilities" (OuterVolumeSpecName: "utilities") pod "9abc9a8f-c31f-4659-b129-496169835e6e" (UID: "9abc9a8f-c31f-4659-b129-496169835e6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.189768 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abc9a8f-c31f-4659-b129-496169835e6e-kube-api-access-pw9n8" (OuterVolumeSpecName: "kube-api-access-pw9n8") pod "9abc9a8f-c31f-4659-b129-496169835e6e" (UID: "9abc9a8f-c31f-4659-b129-496169835e6e"). InnerVolumeSpecName "kube-api-access-pw9n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.279364 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw9n8\" (UniqueName: \"kubernetes.io/projected/9abc9a8f-c31f-4659-b129-496169835e6e-kube-api-access-pw9n8\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.279416 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.321069 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9abc9a8f-c31f-4659-b129-496169835e6e" (UID: "9abc9a8f-c31f-4659-b129-496169835e6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.380042 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abc9a8f-c31f-4659-b129-496169835e6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.528932 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkpsd" event={"ID":"9abc9a8f-c31f-4659-b129-496169835e6e","Type":"ContainerDied","Data":"db64823bace6641fcda459f24914c1f1fd310720128f4e12616036be59f256a9"} Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.528990 4921 scope.go:117] "RemoveContainer" containerID="1ed71ce1ed5ab68337e6c3a6c46044ada43c6211bb515b5c1be1c89c5b18ad7f" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.529936 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkpsd" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.548783 4921 scope.go:117] "RemoveContainer" containerID="5e452d43c2e44473324c1f31c0f2231610b2f0700971f5ebaf3bf995099b1a52" Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.564138 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkpsd"] Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.571195 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zkpsd"] Mar 18 13:02:59 crc kubenswrapper[4921]: I0318 13:02:59.587819 4921 scope.go:117] "RemoveContainer" containerID="8a7fe07dfe19be0834af0b05aee61f0417590a9212ab896fb02d20cf2d292a05" Mar 18 13:03:01 crc kubenswrapper[4921]: I0318 13:03:01.217926 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" path="/var/lib/kubelet/pods/9abc9a8f-c31f-4659-b129-496169835e6e/volumes" Mar 18 13:03:05 crc kubenswrapper[4921]: I0318 13:03:05.757845 4921 scope.go:117] "RemoveContainer" containerID="c2926129032f1385a7e1aafb4f41d9dad0d5abab84fda6b99aa8ec375cf84e39" Mar 18 13:03:08 crc kubenswrapper[4921]: I0318 13:03:08.209627 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:03:08 crc kubenswrapper[4921]: E0318 13:03:08.210208 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:03:20 crc kubenswrapper[4921]: I0318 13:03:20.209210 4921 scope.go:117] "RemoveContainer" 
containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:03:20 crc kubenswrapper[4921]: I0318 13:03:20.684584 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"72467abf55e7293f070703b0f52be49140e85e78bcece09c3bb1e39d180d6c03"} Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.174370 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563984-tq8sk"] Mar 18 13:04:00 crc kubenswrapper[4921]: E0318 13:04:00.175209 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.175223 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4921]: E0318 13:04:00.175235 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="extract-content" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.175241 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="extract-content" Mar 18 13:04:00 crc kubenswrapper[4921]: E0318 13:04:00.175249 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="extract-utilities" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.175257 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="extract-utilities" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.175408 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9abc9a8f-c31f-4659-b129-496169835e6e" containerName="registry-server" Mar 18 
13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.175861 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.178564 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.179135 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.179884 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.183468 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-tq8sk"] Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.307354 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5mdn\" (UniqueName: \"kubernetes.io/projected/1fe0f8a7-7900-4fc7-8aef-30990c493a6b-kube-api-access-z5mdn\") pod \"auto-csr-approver-29563984-tq8sk\" (UID: \"1fe0f8a7-7900-4fc7-8aef-30990c493a6b\") " pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.408645 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5mdn\" (UniqueName: \"kubernetes.io/projected/1fe0f8a7-7900-4fc7-8aef-30990c493a6b-kube-api-access-z5mdn\") pod \"auto-csr-approver-29563984-tq8sk\" (UID: \"1fe0f8a7-7900-4fc7-8aef-30990c493a6b\") " pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.428165 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5mdn\" (UniqueName: 
\"kubernetes.io/projected/1fe0f8a7-7900-4fc7-8aef-30990c493a6b-kube-api-access-z5mdn\") pod \"auto-csr-approver-29563984-tq8sk\" (UID: \"1fe0f8a7-7900-4fc7-8aef-30990c493a6b\") " pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.493178 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.934030 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-tq8sk"] Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.944765 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:04:00 crc kubenswrapper[4921]: I0318 13:04:00.985894 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" event={"ID":"1fe0f8a7-7900-4fc7-8aef-30990c493a6b","Type":"ContainerStarted","Data":"299249ed14c1f0bb0ffd042240a07d8c9327886e4d92703f59cabd7666ffcda6"} Mar 18 13:04:03 crc kubenswrapper[4921]: I0318 13:04:03.001373 4921 generic.go:334] "Generic (PLEG): container finished" podID="1fe0f8a7-7900-4fc7-8aef-30990c493a6b" containerID="e0440f9ecc6ecc33c53e8933a070729423cb99a784ee36f13e59223801e44ddf" exitCode=0 Mar 18 13:04:03 crc kubenswrapper[4921]: I0318 13:04:03.001579 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" event={"ID":"1fe0f8a7-7900-4fc7-8aef-30990c493a6b","Type":"ContainerDied","Data":"e0440f9ecc6ecc33c53e8933a070729423cb99a784ee36f13e59223801e44ddf"} Mar 18 13:04:04 crc kubenswrapper[4921]: I0318 13:04:04.247972 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:04 crc kubenswrapper[4921]: I0318 13:04:04.368050 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5mdn\" (UniqueName: \"kubernetes.io/projected/1fe0f8a7-7900-4fc7-8aef-30990c493a6b-kube-api-access-z5mdn\") pod \"1fe0f8a7-7900-4fc7-8aef-30990c493a6b\" (UID: \"1fe0f8a7-7900-4fc7-8aef-30990c493a6b\") " Mar 18 13:04:04 crc kubenswrapper[4921]: I0318 13:04:04.375074 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe0f8a7-7900-4fc7-8aef-30990c493a6b-kube-api-access-z5mdn" (OuterVolumeSpecName: "kube-api-access-z5mdn") pod "1fe0f8a7-7900-4fc7-8aef-30990c493a6b" (UID: "1fe0f8a7-7900-4fc7-8aef-30990c493a6b"). InnerVolumeSpecName "kube-api-access-z5mdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:04 crc kubenswrapper[4921]: I0318 13:04:04.470289 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5mdn\" (UniqueName: \"kubernetes.io/projected/1fe0f8a7-7900-4fc7-8aef-30990c493a6b-kube-api-access-z5mdn\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:05 crc kubenswrapper[4921]: I0318 13:04:05.051772 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" event={"ID":"1fe0f8a7-7900-4fc7-8aef-30990c493a6b","Type":"ContainerDied","Data":"299249ed14c1f0bb0ffd042240a07d8c9327886e4d92703f59cabd7666ffcda6"} Mar 18 13:04:05 crc kubenswrapper[4921]: I0318 13:04:05.052135 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299249ed14c1f0bb0ffd042240a07d8c9327886e4d92703f59cabd7666ffcda6" Mar 18 13:04:05 crc kubenswrapper[4921]: I0318 13:04:05.051848 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-tq8sk" Mar 18 13:04:05 crc kubenswrapper[4921]: I0318 13:04:05.313699 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-zmmbx"] Mar 18 13:04:05 crc kubenswrapper[4921]: I0318 13:04:05.319019 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-zmmbx"] Mar 18 13:04:07 crc kubenswrapper[4921]: I0318 13:04:07.218380 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7" path="/var/lib/kubelet/pods/a32d9a3b-d8a0-4c27-8ce3-0662b22c48a7/volumes" Mar 18 13:05:05 crc kubenswrapper[4921]: I0318 13:05:05.858944 4921 scope.go:117] "RemoveContainer" containerID="3cbc1e0884d96adee4b546de26bd329cc7cd75736ec203eb3a3b0cf84d58b3ac" Mar 18 13:05:47 crc kubenswrapper[4921]: I0318 13:05:47.081163 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:05:47 crc kubenswrapper[4921]: I0318 13:05:47.082791 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.154617 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563986-84lsj"] Mar 18 13:06:00 crc kubenswrapper[4921]: E0318 13:06:00.155662 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe0f8a7-7900-4fc7-8aef-30990c493a6b" containerName="oc" Mar 18 13:06:00 crc 
kubenswrapper[4921]: I0318 13:06:00.155681 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe0f8a7-7900-4fc7-8aef-30990c493a6b" containerName="oc" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.155897 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe0f8a7-7900-4fc7-8aef-30990c493a6b" containerName="oc" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.156483 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.160463 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.161658 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-84lsj"] Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.163130 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.163440 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.354801 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v24r\" (UniqueName: \"kubernetes.io/projected/0714299b-200c-4333-ad4b-b925ebeac926-kube-api-access-5v24r\") pod \"auto-csr-approver-29563986-84lsj\" (UID: \"0714299b-200c-4333-ad4b-b925ebeac926\") " pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.456006 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v24r\" (UniqueName: \"kubernetes.io/projected/0714299b-200c-4333-ad4b-b925ebeac926-kube-api-access-5v24r\") pod \"auto-csr-approver-29563986-84lsj\" 
(UID: \"0714299b-200c-4333-ad4b-b925ebeac926\") " pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.481996 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v24r\" (UniqueName: \"kubernetes.io/projected/0714299b-200c-4333-ad4b-b925ebeac926-kube-api-access-5v24r\") pod \"auto-csr-approver-29563986-84lsj\" (UID: \"0714299b-200c-4333-ad4b-b925ebeac926\") " pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:00 crc kubenswrapper[4921]: I0318 13:06:00.774977 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:01 crc kubenswrapper[4921]: I0318 13:06:01.200783 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-84lsj"] Mar 18 13:06:01 crc kubenswrapper[4921]: W0318 13:06:01.209569 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0714299b_200c_4333_ad4b_b925ebeac926.slice/crio-9bfd3c4190628f7911f8a91c4312525d94a387e214bc2088acd1e55d5a644e06 WatchSource:0}: Error finding container 9bfd3c4190628f7911f8a91c4312525d94a387e214bc2088acd1e55d5a644e06: Status 404 returned error can't find the container with id 9bfd3c4190628f7911f8a91c4312525d94a387e214bc2088acd1e55d5a644e06 Mar 18 13:06:01 crc kubenswrapper[4921]: I0318 13:06:01.895215 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-84lsj" event={"ID":"0714299b-200c-4333-ad4b-b925ebeac926","Type":"ContainerStarted","Data":"9bfd3c4190628f7911f8a91c4312525d94a387e214bc2088acd1e55d5a644e06"} Mar 18 13:06:02 crc kubenswrapper[4921]: I0318 13:06:02.908546 4921 generic.go:334] "Generic (PLEG): container finished" podID="0714299b-200c-4333-ad4b-b925ebeac926" containerID="331b5f285a5d86be19fd494a86afd73ab22c3b289871fa8852240cfad5a60fb5" 
exitCode=0 Mar 18 13:06:02 crc kubenswrapper[4921]: I0318 13:06:02.908636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-84lsj" event={"ID":"0714299b-200c-4333-ad4b-b925ebeac926","Type":"ContainerDied","Data":"331b5f285a5d86be19fd494a86afd73ab22c3b289871fa8852240cfad5a60fb5"} Mar 18 13:06:05 crc kubenswrapper[4921]: I0318 13:06:05.935404 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:06 crc kubenswrapper[4921]: I0318 13:06:06.049101 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v24r\" (UniqueName: \"kubernetes.io/projected/0714299b-200c-4333-ad4b-b925ebeac926-kube-api-access-5v24r\") pod \"0714299b-200c-4333-ad4b-b925ebeac926\" (UID: \"0714299b-200c-4333-ad4b-b925ebeac926\") " Mar 18 13:06:06 crc kubenswrapper[4921]: I0318 13:06:06.056249 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0714299b-200c-4333-ad4b-b925ebeac926-kube-api-access-5v24r" (OuterVolumeSpecName: "kube-api-access-5v24r") pod "0714299b-200c-4333-ad4b-b925ebeac926" (UID: "0714299b-200c-4333-ad4b-b925ebeac926"). InnerVolumeSpecName "kube-api-access-5v24r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:06 crc kubenswrapper[4921]: I0318 13:06:06.150952 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v24r\" (UniqueName: \"kubernetes.io/projected/0714299b-200c-4333-ad4b-b925ebeac926-kube-api-access-5v24r\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:06 crc kubenswrapper[4921]: I0318 13:06:06.676034 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-84lsj" event={"ID":"0714299b-200c-4333-ad4b-b925ebeac926","Type":"ContainerDied","Data":"9bfd3c4190628f7911f8a91c4312525d94a387e214bc2088acd1e55d5a644e06"} Mar 18 13:06:06 crc kubenswrapper[4921]: I0318 13:06:06.676078 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bfd3c4190628f7911f8a91c4312525d94a387e214bc2088acd1e55d5a644e06" Mar 18 13:06:06 crc kubenswrapper[4921]: I0318 13:06:06.676085 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-84lsj" Mar 18 13:06:07 crc kubenswrapper[4921]: I0318 13:06:07.007211 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-jp58n"] Mar 18 13:06:07 crc kubenswrapper[4921]: I0318 13:06:07.013689 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-jp58n"] Mar 18 13:06:07 crc kubenswrapper[4921]: I0318 13:06:07.217948 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ebf67d9-23d4-42bc-8493-5558214b76b3" path="/var/lib/kubelet/pods/2ebf67d9-23d4-42bc-8493-5558214b76b3/volumes" Mar 18 13:06:17 crc kubenswrapper[4921]: I0318 13:06:17.080963 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 13:06:17 crc kubenswrapper[4921]: I0318 13:06:17.081620 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.747203 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d9975"] Mar 18 13:06:33 crc kubenswrapper[4921]: E0318 13:06:33.748153 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0714299b-200c-4333-ad4b-b925ebeac926" containerName="oc" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.748167 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0714299b-200c-4333-ad4b-b925ebeac926" containerName="oc" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.748346 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0714299b-200c-4333-ad4b-b925ebeac926" containerName="oc" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.749561 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.768400 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9975"] Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.847041 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99s42\" (UniqueName: \"kubernetes.io/projected/b52bdde8-6514-412b-832a-ae8635442ef7-kube-api-access-99s42\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.847098 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-catalog-content\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.847154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-utilities\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.948234 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-utilities\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.948378 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-99s42\" (UniqueName: \"kubernetes.io/projected/b52bdde8-6514-412b-832a-ae8635442ef7-kube-api-access-99s42\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.948409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-catalog-content\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.948920 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-catalog-content\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.948933 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-utilities\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:33 crc kubenswrapper[4921]: I0318 13:06:33.975129 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99s42\" (UniqueName: \"kubernetes.io/projected/b52bdde8-6514-412b-832a-ae8635442ef7-kube-api-access-99s42\") pod \"redhat-marketplace-d9975\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:34 crc kubenswrapper[4921]: I0318 13:06:34.069034 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:34 crc kubenswrapper[4921]: I0318 13:06:34.531750 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9975"] Mar 18 13:06:34 crc kubenswrapper[4921]: I0318 13:06:34.871492 4921 generic.go:334] "Generic (PLEG): container finished" podID="b52bdde8-6514-412b-832a-ae8635442ef7" containerID="fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e" exitCode=0 Mar 18 13:06:34 crc kubenswrapper[4921]: I0318 13:06:34.871550 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9975" event={"ID":"b52bdde8-6514-412b-832a-ae8635442ef7","Type":"ContainerDied","Data":"fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e"} Mar 18 13:06:34 crc kubenswrapper[4921]: I0318 13:06:34.871581 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9975" event={"ID":"b52bdde8-6514-412b-832a-ae8635442ef7","Type":"ContainerStarted","Data":"96e28b25f132e482227218f6a19972aaad96e10da71a3b38be9de0ff990cab51"} Mar 18 13:06:35 crc kubenswrapper[4921]: I0318 13:06:35.883070 4921 generic.go:334] "Generic (PLEG): container finished" podID="b52bdde8-6514-412b-832a-ae8635442ef7" containerID="8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e" exitCode=0 Mar 18 13:06:35 crc kubenswrapper[4921]: I0318 13:06:35.883150 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9975" event={"ID":"b52bdde8-6514-412b-832a-ae8635442ef7","Type":"ContainerDied","Data":"8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e"} Mar 18 13:06:36 crc kubenswrapper[4921]: I0318 13:06:36.893136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9975" 
event={"ID":"b52bdde8-6514-412b-832a-ae8635442ef7","Type":"ContainerStarted","Data":"1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d"} Mar 18 13:06:36 crc kubenswrapper[4921]: I0318 13:06:36.917353 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d9975" podStartSLOduration=2.49707738 podStartE2EDuration="3.917329807s" podCreationTimestamp="2026-03-18 13:06:33 +0000 UTC" firstStartedPulling="2026-03-18 13:06:34.872947657 +0000 UTC m=+3414.422868296" lastFinishedPulling="2026-03-18 13:06:36.293200094 +0000 UTC m=+3415.843120723" observedRunningTime="2026-03-18 13:06:36.913667423 +0000 UTC m=+3416.463588082" watchObservedRunningTime="2026-03-18 13:06:36.917329807 +0000 UTC m=+3416.467250446" Mar 18 13:06:44 crc kubenswrapper[4921]: I0318 13:06:44.069817 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:44 crc kubenswrapper[4921]: I0318 13:06:44.071899 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:44 crc kubenswrapper[4921]: I0318 13:06:44.116758 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:45 crc kubenswrapper[4921]: I0318 13:06:45.006868 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:45 crc kubenswrapper[4921]: I0318 13:06:45.063635 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9975"] Mar 18 13:06:46 crc kubenswrapper[4921]: I0318 13:06:46.970591 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d9975" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="registry-server" 
containerID="cri-o://1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d" gracePeriod=2 Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.081776 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.081856 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.081927 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.082699 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72467abf55e7293f070703b0f52be49140e85e78bcece09c3bb1e39d180d6c03"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.082763 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://72467abf55e7293f070703b0f52be49140e85e78bcece09c3bb1e39d180d6c03" gracePeriod=600 Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.436498 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.562162 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99s42\" (UniqueName: \"kubernetes.io/projected/b52bdde8-6514-412b-832a-ae8635442ef7-kube-api-access-99s42\") pod \"b52bdde8-6514-412b-832a-ae8635442ef7\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.562341 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-utilities\") pod \"b52bdde8-6514-412b-832a-ae8635442ef7\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.562408 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-catalog-content\") pod \"b52bdde8-6514-412b-832a-ae8635442ef7\" (UID: \"b52bdde8-6514-412b-832a-ae8635442ef7\") " Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.563454 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-utilities" (OuterVolumeSpecName: "utilities") pod "b52bdde8-6514-412b-832a-ae8635442ef7" (UID: "b52bdde8-6514-412b-832a-ae8635442ef7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.581290 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b52bdde8-6514-412b-832a-ae8635442ef7-kube-api-access-99s42" (OuterVolumeSpecName: "kube-api-access-99s42") pod "b52bdde8-6514-412b-832a-ae8635442ef7" (UID: "b52bdde8-6514-412b-832a-ae8635442ef7"). InnerVolumeSpecName "kube-api-access-99s42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.591081 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b52bdde8-6514-412b-832a-ae8635442ef7" (UID: "b52bdde8-6514-412b-832a-ae8635442ef7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.663872 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.663930 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b52bdde8-6514-412b-832a-ae8635442ef7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.663946 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99s42\" (UniqueName: \"kubernetes.io/projected/b52bdde8-6514-412b-832a-ae8635442ef7-kube-api-access-99s42\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.981177 4921 generic.go:334] "Generic (PLEG): container finished" podID="b52bdde8-6514-412b-832a-ae8635442ef7" containerID="1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d" exitCode=0 Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.981284 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9975" event={"ID":"b52bdde8-6514-412b-832a-ae8635442ef7","Type":"ContainerDied","Data":"1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d"} Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.981627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-d9975" event={"ID":"b52bdde8-6514-412b-832a-ae8635442ef7","Type":"ContainerDied","Data":"96e28b25f132e482227218f6a19972aaad96e10da71a3b38be9de0ff990cab51"} Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.981331 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9975" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.981658 4921 scope.go:117] "RemoveContainer" containerID="1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d" Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.987542 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="72467abf55e7293f070703b0f52be49140e85e78bcece09c3bb1e39d180d6c03" exitCode=0 Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.987606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"72467abf55e7293f070703b0f52be49140e85e78bcece09c3bb1e39d180d6c03"} Mar 18 13:06:47 crc kubenswrapper[4921]: I0318 13:06:47.987643 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691"} Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.000248 4921 scope.go:117] "RemoveContainer" containerID="8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.034079 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9975"] Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.034310 4921 scope.go:117] "RemoveContainer" 
containerID="fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.044078 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9975"] Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.056098 4921 scope.go:117] "RemoveContainer" containerID="1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d" Mar 18 13:06:48 crc kubenswrapper[4921]: E0318 13:06:48.056817 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d\": container with ID starting with 1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d not found: ID does not exist" containerID="1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.056854 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d"} err="failed to get container status \"1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d\": rpc error: code = NotFound desc = could not find container \"1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d\": container with ID starting with 1c359f0d15361a119d346a855e367d6758f2892db5467fd82ba699a39199c19d not found: ID does not exist" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.056882 4921 scope.go:117] "RemoveContainer" containerID="8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e" Mar 18 13:06:48 crc kubenswrapper[4921]: E0318 13:06:48.057299 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e\": container with ID starting with 
8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e not found: ID does not exist" containerID="8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.057328 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e"} err="failed to get container status \"8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e\": rpc error: code = NotFound desc = could not find container \"8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e\": container with ID starting with 8a791eed612fe78cc15e15fd8c6739c312ad0ec7b4507dd9a1532a796d6f416e not found: ID does not exist" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.057346 4921 scope.go:117] "RemoveContainer" containerID="fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e" Mar 18 13:06:48 crc kubenswrapper[4921]: E0318 13:06:48.057573 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e\": container with ID starting with fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e not found: ID does not exist" containerID="fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.057604 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e"} err="failed to get container status \"fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e\": rpc error: code = NotFound desc = could not find container \"fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e\": container with ID starting with fe17b7e5d5f982a4a0cae9c248ff2236dbd7fc97397d5ed78d67224befae7a2e not found: ID does not 
exist" Mar 18 13:06:48 crc kubenswrapper[4921]: I0318 13:06:48.057629 4921 scope.go:117] "RemoveContainer" containerID="4d16bfc29f4fedefe41db9c5b4317d3b16e36478559b582b85b9914b958c9785" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.220665 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" path="/var/lib/kubelet/pods/b52bdde8-6514-412b-832a-ae8635442ef7/volumes" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.885658 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-784wq"] Mar 18 13:06:49 crc kubenswrapper[4921]: E0318 13:06:49.889397 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="extract-utilities" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.889433 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="extract-utilities" Mar 18 13:06:49 crc kubenswrapper[4921]: E0318 13:06:49.889447 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="extract-content" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.889455 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="extract-content" Mar 18 13:06:49 crc kubenswrapper[4921]: E0318 13:06:49.889469 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="registry-server" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.889477 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" containerName="registry-server" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.889681 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b52bdde8-6514-412b-832a-ae8635442ef7" 
containerName="registry-server" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.890844 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.896039 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-784wq"] Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.998448 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-catalog-content\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.998654 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-utilities\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:49 crc kubenswrapper[4921]: I0318 13:06:49.998875 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5g5f\" (UniqueName: \"kubernetes.io/projected/e481e365-f209-4d89-97d4-e8915cae76d9-kube-api-access-p5g5f\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.100010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-utilities\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " 
pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.100085 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5g5f\" (UniqueName: \"kubernetes.io/projected/e481e365-f209-4d89-97d4-e8915cae76d9-kube-api-access-p5g5f\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.100178 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-catalog-content\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.101191 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-utilities\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.102765 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-catalog-content\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.169880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5g5f\" (UniqueName: \"kubernetes.io/projected/e481e365-f209-4d89-97d4-e8915cae76d9-kube-api-access-p5g5f\") pod \"certified-operators-784wq\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " 
pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.212961 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:06:50 crc kubenswrapper[4921]: I0318 13:06:50.591011 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-784wq"] Mar 18 13:06:51 crc kubenswrapper[4921]: I0318 13:06:51.012533 4921 generic.go:334] "Generic (PLEG): container finished" podID="e481e365-f209-4d89-97d4-e8915cae76d9" containerID="169ed20475df90743151951c7a40b03d4e3050566620a9a5bf46849bc64e37ed" exitCode=0 Mar 18 13:06:51 crc kubenswrapper[4921]: I0318 13:06:51.012652 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784wq" event={"ID":"e481e365-f209-4d89-97d4-e8915cae76d9","Type":"ContainerDied","Data":"169ed20475df90743151951c7a40b03d4e3050566620a9a5bf46849bc64e37ed"} Mar 18 13:06:51 crc kubenswrapper[4921]: I0318 13:06:51.012982 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784wq" event={"ID":"e481e365-f209-4d89-97d4-e8915cae76d9","Type":"ContainerStarted","Data":"65973151eb8fd8921cf4751ebaeace99e5827c9782b3da377d7e9390b0c10df2"} Mar 18 13:06:53 crc kubenswrapper[4921]: I0318 13:06:53.031649 4921 generic.go:334] "Generic (PLEG): container finished" podID="e481e365-f209-4d89-97d4-e8915cae76d9" containerID="9164ba1989b41afb071538872be2f64f02e2084064e66e5567dbba8e15a065fb" exitCode=0 Mar 18 13:06:53 crc kubenswrapper[4921]: I0318 13:06:53.031875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784wq" event={"ID":"e481e365-f209-4d89-97d4-e8915cae76d9","Type":"ContainerDied","Data":"9164ba1989b41afb071538872be2f64f02e2084064e66e5567dbba8e15a065fb"} Mar 18 13:06:54 crc kubenswrapper[4921]: I0318 13:06:54.044062 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-784wq" event={"ID":"e481e365-f209-4d89-97d4-e8915cae76d9","Type":"ContainerStarted","Data":"2b09a476c34a8c64a6864edacc0be951e7b6225163505f15383061ab2820a7b7"} Mar 18 13:06:54 crc kubenswrapper[4921]: I0318 13:06:54.061796 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-784wq" podStartSLOduration=2.5252085380000002 podStartE2EDuration="5.061775133s" podCreationTimestamp="2026-03-18 13:06:49 +0000 UTC" firstStartedPulling="2026-03-18 13:06:51.014380691 +0000 UTC m=+3430.564301330" lastFinishedPulling="2026-03-18 13:06:53.550947286 +0000 UTC m=+3433.100867925" observedRunningTime="2026-03-18 13:06:54.059695174 +0000 UTC m=+3433.609615823" watchObservedRunningTime="2026-03-18 13:06:54.061775133 +0000 UTC m=+3433.611695772" Mar 18 13:07:00 crc kubenswrapper[4921]: I0318 13:07:00.214065 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:07:00 crc kubenswrapper[4921]: I0318 13:07:00.214793 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:07:00 crc kubenswrapper[4921]: I0318 13:07:00.251027 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:07:01 crc kubenswrapper[4921]: I0318 13:07:01.137973 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:07:01 crc kubenswrapper[4921]: I0318 13:07:01.184221 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-784wq"] Mar 18 13:07:03 crc kubenswrapper[4921]: I0318 13:07:03.216626 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-784wq" 
podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="registry-server" containerID="cri-o://2b09a476c34a8c64a6864edacc0be951e7b6225163505f15383061ab2820a7b7" gracePeriod=2 Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.226471 4921 generic.go:334] "Generic (PLEG): container finished" podID="e481e365-f209-4d89-97d4-e8915cae76d9" containerID="2b09a476c34a8c64a6864edacc0be951e7b6225163505f15383061ab2820a7b7" exitCode=0 Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.226542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784wq" event={"ID":"e481e365-f209-4d89-97d4-e8915cae76d9","Type":"ContainerDied","Data":"2b09a476c34a8c64a6864edacc0be951e7b6225163505f15383061ab2820a7b7"} Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.305900 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.431451 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5g5f\" (UniqueName: \"kubernetes.io/projected/e481e365-f209-4d89-97d4-e8915cae76d9-kube-api-access-p5g5f\") pod \"e481e365-f209-4d89-97d4-e8915cae76d9\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.431538 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-utilities\") pod \"e481e365-f209-4d89-97d4-e8915cae76d9\" (UID: \"e481e365-f209-4d89-97d4-e8915cae76d9\") " Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.431611 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-catalog-content\") pod \"e481e365-f209-4d89-97d4-e8915cae76d9\" (UID: 
\"e481e365-f209-4d89-97d4-e8915cae76d9\") " Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.432918 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-utilities" (OuterVolumeSpecName: "utilities") pod "e481e365-f209-4d89-97d4-e8915cae76d9" (UID: "e481e365-f209-4d89-97d4-e8915cae76d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.437842 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e481e365-f209-4d89-97d4-e8915cae76d9-kube-api-access-p5g5f" (OuterVolumeSpecName: "kube-api-access-p5g5f") pod "e481e365-f209-4d89-97d4-e8915cae76d9" (UID: "e481e365-f209-4d89-97d4-e8915cae76d9"). InnerVolumeSpecName "kube-api-access-p5g5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.485052 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e481e365-f209-4d89-97d4-e8915cae76d9" (UID: "e481e365-f209-4d89-97d4-e8915cae76d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.533356 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.533640 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5g5f\" (UniqueName: \"kubernetes.io/projected/e481e365-f209-4d89-97d4-e8915cae76d9-kube-api-access-p5g5f\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:04 crc kubenswrapper[4921]: I0318 13:07:04.533717 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e481e365-f209-4d89-97d4-e8915cae76d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.239624 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-784wq" event={"ID":"e481e365-f209-4d89-97d4-e8915cae76d9","Type":"ContainerDied","Data":"65973151eb8fd8921cf4751ebaeace99e5827c9782b3da377d7e9390b0c10df2"} Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.239684 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-784wq" Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.239699 4921 scope.go:117] "RemoveContainer" containerID="2b09a476c34a8c64a6864edacc0be951e7b6225163505f15383061ab2820a7b7" Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.266138 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-784wq"] Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.272793 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-784wq"] Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.273991 4921 scope.go:117] "RemoveContainer" containerID="9164ba1989b41afb071538872be2f64f02e2084064e66e5567dbba8e15a065fb" Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.293743 4921 scope.go:117] "RemoveContainer" containerID="169ed20475df90743151951c7a40b03d4e3050566620a9a5bf46849bc64e37ed" Mar 18 13:07:05 crc kubenswrapper[4921]: I0318 13:07:05.942541 4921 scope.go:117] "RemoveContainer" containerID="9a3389b29836630762ba276e55c2b3679926584590a4669e52a112dfd31bc939" Mar 18 13:07:07 crc kubenswrapper[4921]: I0318 13:07:07.226219 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" path="/var/lib/kubelet/pods/e481e365-f209-4d89-97d4-e8915cae76d9/volumes" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.154362 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563988-shx76"] Mar 18 13:08:00 crc kubenswrapper[4921]: E0318 13:08:00.155297 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="extract-content" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.155316 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="extract-content" Mar 18 13:08:00 crc kubenswrapper[4921]: 
E0318 13:08:00.155496 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="registry-server" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.155544 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="registry-server" Mar 18 13:08:00 crc kubenswrapper[4921]: E0318 13:08:00.155576 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="extract-utilities" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.155588 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="extract-utilities" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.155939 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e481e365-f209-4d89-97d4-e8915cae76d9" containerName="registry-server" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.156747 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.161101 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.161450 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.161656 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.164590 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-shx76"] Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.244883 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpqb\" (UniqueName: \"kubernetes.io/projected/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa-kube-api-access-rlpqb\") pod \"auto-csr-approver-29563988-shx76\" (UID: \"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa\") " pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.346287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpqb\" (UniqueName: \"kubernetes.io/projected/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa-kube-api-access-rlpqb\") pod \"auto-csr-approver-29563988-shx76\" (UID: \"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa\") " pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.366223 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpqb\" (UniqueName: \"kubernetes.io/projected/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa-kube-api-access-rlpqb\") pod \"auto-csr-approver-29563988-shx76\" (UID: \"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa\") " 
pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.502511 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:00 crc kubenswrapper[4921]: I0318 13:08:00.914059 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-shx76"] Mar 18 13:08:01 crc kubenswrapper[4921]: I0318 13:08:01.678555 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-shx76" event={"ID":"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa","Type":"ContainerStarted","Data":"3fd34a393ae4a82bcd96fd3fac9e5c6caca16071bb3c474f1da5ef665b95f730"} Mar 18 13:08:02 crc kubenswrapper[4921]: I0318 13:08:02.687724 4921 generic.go:334] "Generic (PLEG): container finished" podID="a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa" containerID="25252fffa055349439fbb46c753e0ac45be0c99163d5eafb3cba190386243906" exitCode=0 Mar 18 13:08:02 crc kubenswrapper[4921]: I0318 13:08:02.687784 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-shx76" event={"ID":"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa","Type":"ContainerDied","Data":"25252fffa055349439fbb46c753e0ac45be0c99163d5eafb3cba190386243906"} Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.025457 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.204566 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlpqb\" (UniqueName: \"kubernetes.io/projected/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa-kube-api-access-rlpqb\") pod \"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa\" (UID: \"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa\") " Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.210554 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa-kube-api-access-rlpqb" (OuterVolumeSpecName: "kube-api-access-rlpqb") pod "a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa" (UID: "a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa"). InnerVolumeSpecName "kube-api-access-rlpqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.306174 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlpqb\" (UniqueName: \"kubernetes.io/projected/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa-kube-api-access-rlpqb\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.705535 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-shx76" event={"ID":"a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa","Type":"ContainerDied","Data":"3fd34a393ae4a82bcd96fd3fac9e5c6caca16071bb3c474f1da5ef665b95f730"} Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.705580 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-shx76" Mar 18 13:08:04 crc kubenswrapper[4921]: I0318 13:08:04.705578 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fd34a393ae4a82bcd96fd3fac9e5c6caca16071bb3c474f1da5ef665b95f730" Mar 18 13:08:05 crc kubenswrapper[4921]: I0318 13:08:05.102266 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-zb644"] Mar 18 13:08:05 crc kubenswrapper[4921]: I0318 13:08:05.108945 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-zb644"] Mar 18 13:08:05 crc kubenswrapper[4921]: I0318 13:08:05.217673 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701ce28f-e7a4-493d-94ec-db21d0569e2c" path="/var/lib/kubelet/pods/701ce28f-e7a4-493d-94ec-db21d0569e2c/volumes" Mar 18 13:08:06 crc kubenswrapper[4921]: I0318 13:08:06.041144 4921 scope.go:117] "RemoveContainer" containerID="83154aa6cebb8e0003afad517f1feb77846dabe121d4df1b80c6ed17984b44c4" Mar 18 13:08:47 crc kubenswrapper[4921]: I0318 13:08:47.081203 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:08:47 crc kubenswrapper[4921]: I0318 13:08:47.081823 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:09:17 crc kubenswrapper[4921]: I0318 13:09:17.081231 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:09:17 crc kubenswrapper[4921]: I0318 13:09:17.081919 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.080965 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.081450 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.081500 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.082135 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:09:47 crc 
kubenswrapper[4921]: I0318 13:09:47.082186 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" gracePeriod=600 Mar 18 13:09:47 crc kubenswrapper[4921]: E0318 13:09:47.201071 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.450459 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" exitCode=0 Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.450506 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691"} Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.450543 4921 scope.go:117] "RemoveContainer" containerID="72467abf55e7293f070703b0f52be49140e85e78bcece09c3bb1e39d180d6c03" Mar 18 13:09:47 crc kubenswrapper[4921]: I0318 13:09:47.451047 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:09:47 crc kubenswrapper[4921]: E0318 13:09:47.451313 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.151949 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563990-sdgcp"] Mar 18 13:10:00 crc kubenswrapper[4921]: E0318 13:10:00.160621 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa" containerName="oc" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.160670 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa" containerName="oc" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.160928 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa" containerName="oc" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.161604 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.163763 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.164053 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.165386 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.165674 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-sdgcp"] Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.209443 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:10:00 crc kubenswrapper[4921]: E0318 13:10:00.209652 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.333058 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhz2c\" (UniqueName: \"kubernetes.io/projected/bbf8163f-8c37-4da7-900a-edad654804a5-kube-api-access-fhz2c\") pod \"auto-csr-approver-29563990-sdgcp\" (UID: \"bbf8163f-8c37-4da7-900a-edad654804a5\") " pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.434461 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fhz2c\" (UniqueName: \"kubernetes.io/projected/bbf8163f-8c37-4da7-900a-edad654804a5-kube-api-access-fhz2c\") pod \"auto-csr-approver-29563990-sdgcp\" (UID: \"bbf8163f-8c37-4da7-900a-edad654804a5\") " pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.453706 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhz2c\" (UniqueName: \"kubernetes.io/projected/bbf8163f-8c37-4da7-900a-edad654804a5-kube-api-access-fhz2c\") pod \"auto-csr-approver-29563990-sdgcp\" (UID: \"bbf8163f-8c37-4da7-900a-edad654804a5\") " pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.498321 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.936778 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-sdgcp"] Mar 18 13:10:00 crc kubenswrapper[4921]: I0318 13:10:00.946571 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:10:01 crc kubenswrapper[4921]: I0318 13:10:01.546842 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" event={"ID":"bbf8163f-8c37-4da7-900a-edad654804a5","Type":"ContainerStarted","Data":"ba64e39c6f447200648bace98651a4d3a659e59ec7a0f2b0c8f1cc87d5e39dc2"} Mar 18 13:10:03 crc kubenswrapper[4921]: I0318 13:10:03.572174 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" event={"ID":"bbf8163f-8c37-4da7-900a-edad654804a5","Type":"ContainerStarted","Data":"c14d25496252783b3e83edb9e4d7cbfecbd5d74635a6ed65948a20db2e2b750a"} Mar 18 13:10:03 crc kubenswrapper[4921]: I0318 13:10:03.590392 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" podStartSLOduration=1.230142134 podStartE2EDuration="3.590371424s" podCreationTimestamp="2026-03-18 13:10:00 +0000 UTC" firstStartedPulling="2026-03-18 13:10:00.946274538 +0000 UTC m=+3620.496195177" lastFinishedPulling="2026-03-18 13:10:03.306503828 +0000 UTC m=+3622.856424467" observedRunningTime="2026-03-18 13:10:03.583593531 +0000 UTC m=+3623.133514180" watchObservedRunningTime="2026-03-18 13:10:03.590371424 +0000 UTC m=+3623.140292063" Mar 18 13:10:04 crc kubenswrapper[4921]: I0318 13:10:04.580524 4921 generic.go:334] "Generic (PLEG): container finished" podID="bbf8163f-8c37-4da7-900a-edad654804a5" containerID="c14d25496252783b3e83edb9e4d7cbfecbd5d74635a6ed65948a20db2e2b750a" exitCode=0 Mar 18 13:10:04 crc kubenswrapper[4921]: I0318 13:10:04.580570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" event={"ID":"bbf8163f-8c37-4da7-900a-edad654804a5","Type":"ContainerDied","Data":"c14d25496252783b3e83edb9e4d7cbfecbd5d74635a6ed65948a20db2e2b750a"} Mar 18 13:10:05 crc kubenswrapper[4921]: I0318 13:10:05.928356 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.019433 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhz2c\" (UniqueName: \"kubernetes.io/projected/bbf8163f-8c37-4da7-900a-edad654804a5-kube-api-access-fhz2c\") pod \"bbf8163f-8c37-4da7-900a-edad654804a5\" (UID: \"bbf8163f-8c37-4da7-900a-edad654804a5\") " Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.026364 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf8163f-8c37-4da7-900a-edad654804a5-kube-api-access-fhz2c" (OuterVolumeSpecName: "kube-api-access-fhz2c") pod "bbf8163f-8c37-4da7-900a-edad654804a5" (UID: "bbf8163f-8c37-4da7-900a-edad654804a5"). InnerVolumeSpecName "kube-api-access-fhz2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.120960 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhz2c\" (UniqueName: \"kubernetes.io/projected/bbf8163f-8c37-4da7-900a-edad654804a5-kube-api-access-fhz2c\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.594378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" event={"ID":"bbf8163f-8c37-4da7-900a-edad654804a5","Type":"ContainerDied","Data":"ba64e39c6f447200648bace98651a4d3a659e59ec7a0f2b0c8f1cc87d5e39dc2"} Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.594712 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba64e39c6f447200648bace98651a4d3a659e59ec7a0f2b0c8f1cc87d5e39dc2" Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.594412 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-sdgcp" Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.646852 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-tq8sk"] Mar 18 13:10:06 crc kubenswrapper[4921]: I0318 13:10:06.652595 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-tq8sk"] Mar 18 13:10:07 crc kubenswrapper[4921]: I0318 13:10:07.216791 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe0f8a7-7900-4fc7-8aef-30990c493a6b" path="/var/lib/kubelet/pods/1fe0f8a7-7900-4fc7-8aef-30990c493a6b/volumes" Mar 18 13:10:14 crc kubenswrapper[4921]: I0318 13:10:14.209372 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:10:14 crc kubenswrapper[4921]: E0318 13:10:14.210081 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.176853 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgjb5"] Mar 18 13:10:27 crc kubenswrapper[4921]: E0318 13:10:27.178807 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf8163f-8c37-4da7-900a-edad654804a5" containerName="oc" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.178836 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf8163f-8c37-4da7-900a-edad654804a5" containerName="oc" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.179142 4921 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="bbf8163f-8c37-4da7-900a-edad654804a5" containerName="oc" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.180660 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.194909 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgjb5"] Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.209278 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:10:27 crc kubenswrapper[4921]: E0318 13:10:27.209525 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.334943 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-utilities\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.335038 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jlb\" (UniqueName: \"kubernetes.io/projected/2075a29d-49cd-488c-b1c7-02142b1eb04a-kube-api-access-59jlb\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 
13:10:27.335236 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-catalog-content\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.436301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-catalog-content\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.436384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-utilities\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.436431 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jlb\" (UniqueName: \"kubernetes.io/projected/2075a29d-49cd-488c-b1c7-02142b1eb04a-kube-api-access-59jlb\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.437337 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-catalog-content\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 
13:10:27.437625 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-utilities\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.457053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jlb\" (UniqueName: \"kubernetes.io/projected/2075a29d-49cd-488c-b1c7-02142b1eb04a-kube-api-access-59jlb\") pod \"community-operators-kgjb5\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:27 crc kubenswrapper[4921]: I0318 13:10:27.507369 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:28 crc kubenswrapper[4921]: I0318 13:10:28.038794 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgjb5"] Mar 18 13:10:28 crc kubenswrapper[4921]: I0318 13:10:28.766381 4921 generic.go:334] "Generic (PLEG): container finished" podID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerID="923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7" exitCode=0 Mar 18 13:10:28 crc kubenswrapper[4921]: I0318 13:10:28.766428 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerDied","Data":"923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7"} Mar 18 13:10:28 crc kubenswrapper[4921]: I0318 13:10:28.766471 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerStarted","Data":"50d4fa0f6b3ad085663b26a0c738009de770cbecdfada2136d5c552fac5b329e"} 
Mar 18 13:10:29 crc kubenswrapper[4921]: I0318 13:10:29.774413 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerStarted","Data":"713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3"} Mar 18 13:10:30 crc kubenswrapper[4921]: I0318 13:10:30.784548 4921 generic.go:334] "Generic (PLEG): container finished" podID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerID="713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3" exitCode=0 Mar 18 13:10:30 crc kubenswrapper[4921]: I0318 13:10:30.784595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerDied","Data":"713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3"} Mar 18 13:10:31 crc kubenswrapper[4921]: I0318 13:10:31.793748 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerStarted","Data":"26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05"} Mar 18 13:10:31 crc kubenswrapper[4921]: I0318 13:10:31.825407 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgjb5" podStartSLOduration=2.36378357 podStartE2EDuration="4.825385064s" podCreationTimestamp="2026-03-18 13:10:27 +0000 UTC" firstStartedPulling="2026-03-18 13:10:28.767615175 +0000 UTC m=+3648.317535814" lastFinishedPulling="2026-03-18 13:10:31.229216659 +0000 UTC m=+3650.779137308" observedRunningTime="2026-03-18 13:10:31.814939016 +0000 UTC m=+3651.364859675" watchObservedRunningTime="2026-03-18 13:10:31.825385064 +0000 UTC m=+3651.375305703" Mar 18 13:10:37 crc kubenswrapper[4921]: I0318 13:10:37.507710 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:37 crc kubenswrapper[4921]: I0318 13:10:37.508005 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:37 crc kubenswrapper[4921]: I0318 13:10:37.564148 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:37 crc kubenswrapper[4921]: I0318 13:10:37.887577 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:37 crc kubenswrapper[4921]: I0318 13:10:37.939683 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgjb5"] Mar 18 13:10:39 crc kubenswrapper[4921]: I0318 13:10:39.862287 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgjb5" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="registry-server" containerID="cri-o://26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05" gracePeriod=2 Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.209855 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:10:40 crc kubenswrapper[4921]: E0318 13:10:40.210796 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.291882 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.336256 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59jlb\" (UniqueName: \"kubernetes.io/projected/2075a29d-49cd-488c-b1c7-02142b1eb04a-kube-api-access-59jlb\") pod \"2075a29d-49cd-488c-b1c7-02142b1eb04a\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.336346 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-catalog-content\") pod \"2075a29d-49cd-488c-b1c7-02142b1eb04a\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.336416 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-utilities\") pod \"2075a29d-49cd-488c-b1c7-02142b1eb04a\" (UID: \"2075a29d-49cd-488c-b1c7-02142b1eb04a\") " Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.341095 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-utilities" (OuterVolumeSpecName: "utilities") pod "2075a29d-49cd-488c-b1c7-02142b1eb04a" (UID: "2075a29d-49cd-488c-b1c7-02142b1eb04a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.342193 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2075a29d-49cd-488c-b1c7-02142b1eb04a-kube-api-access-59jlb" (OuterVolumeSpecName: "kube-api-access-59jlb") pod "2075a29d-49cd-488c-b1c7-02142b1eb04a" (UID: "2075a29d-49cd-488c-b1c7-02142b1eb04a"). InnerVolumeSpecName "kube-api-access-59jlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.438188 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.438228 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59jlb\" (UniqueName: \"kubernetes.io/projected/2075a29d-49cd-488c-b1c7-02142b1eb04a-kube-api-access-59jlb\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.761899 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2075a29d-49cd-488c-b1c7-02142b1eb04a" (UID: "2075a29d-49cd-488c-b1c7-02142b1eb04a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.844845 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2075a29d-49cd-488c-b1c7-02142b1eb04a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.870916 4921 generic.go:334] "Generic (PLEG): container finished" podID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerID="26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05" exitCode=0 Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.870971 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgjb5" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.870990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerDied","Data":"26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05"} Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.871053 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgjb5" event={"ID":"2075a29d-49cd-488c-b1c7-02142b1eb04a","Type":"ContainerDied","Data":"50d4fa0f6b3ad085663b26a0c738009de770cbecdfada2136d5c552fac5b329e"} Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.871082 4921 scope.go:117] "RemoveContainer" containerID="26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.886679 4921 scope.go:117] "RemoveContainer" containerID="713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.906225 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgjb5"] Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.911349 4921 scope.go:117] "RemoveContainer" containerID="923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.911717 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kgjb5"] Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.932160 4921 scope.go:117] "RemoveContainer" containerID="26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05" Mar 18 13:10:40 crc kubenswrapper[4921]: E0318 13:10:40.934562 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05\": container with ID starting with 26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05 not found: ID does not exist" containerID="26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.934596 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05"} err="failed to get container status \"26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05\": rpc error: code = NotFound desc = could not find container \"26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05\": container with ID starting with 26720b59112cf8e1b7873d946d671dcba6f97ba73acd6b2a4d643b0142572d05 not found: ID does not exist" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.934618 4921 scope.go:117] "RemoveContainer" containerID="713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3" Mar 18 13:10:40 crc kubenswrapper[4921]: E0318 13:10:40.935040 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3\": container with ID starting with 713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3 not found: ID does not exist" containerID="713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.935094 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3"} err="failed to get container status \"713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3\": rpc error: code = NotFound desc = could not find container \"713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3\": container with ID 
starting with 713c0180d7183befa11757fe6e8a8f9ca12729b8f82fd9d863c631d601667cc3 not found: ID does not exist" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.935140 4921 scope.go:117] "RemoveContainer" containerID="923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7" Mar 18 13:10:40 crc kubenswrapper[4921]: E0318 13:10:40.936036 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7\": container with ID starting with 923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7 not found: ID does not exist" containerID="923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7" Mar 18 13:10:40 crc kubenswrapper[4921]: I0318 13:10:40.936070 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7"} err="failed to get container status \"923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7\": rpc error: code = NotFound desc = could not find container \"923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7\": container with ID starting with 923e06a34537f9fedb1e69bcaa7ac2b2f1e5fe71824ee98ab0af813ca61116e7 not found: ID does not exist" Mar 18 13:10:41 crc kubenswrapper[4921]: I0318 13:10:41.218754 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" path="/var/lib/kubelet/pods/2075a29d-49cd-488c-b1c7-02142b1eb04a/volumes" Mar 18 13:10:51 crc kubenswrapper[4921]: I0318 13:10:51.212777 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:10:51 crc kubenswrapper[4921]: E0318 13:10:51.214796 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:11:03 crc kubenswrapper[4921]: I0318 13:11:03.228517 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:11:03 crc kubenswrapper[4921]: E0318 13:11:03.229982 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:11:06 crc kubenswrapper[4921]: I0318 13:11:06.136412 4921 scope.go:117] "RemoveContainer" containerID="e0440f9ecc6ecc33c53e8933a070729423cb99a784ee36f13e59223801e44ddf" Mar 18 13:11:14 crc kubenswrapper[4921]: I0318 13:11:14.209372 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:11:14 crc kubenswrapper[4921]: E0318 13:11:14.210622 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:11:29 crc kubenswrapper[4921]: I0318 13:11:29.208974 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:11:29 crc 
kubenswrapper[4921]: E0318 13:11:29.209922 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:11:41 crc kubenswrapper[4921]: I0318 13:11:41.213157 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:11:41 crc kubenswrapper[4921]: E0318 13:11:41.214044 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:11:55 crc kubenswrapper[4921]: I0318 13:11:55.209015 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:11:55 crc kubenswrapper[4921]: E0318 13:11:55.209832 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.147626 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563992-gf76r"] Mar 18 
13:12:00 crc kubenswrapper[4921]: E0318 13:12:00.154755 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="extract-utilities" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.154863 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="extract-utilities" Mar 18 13:12:00 crc kubenswrapper[4921]: E0318 13:12:00.154974 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="registry-server" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.155034 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="registry-server" Mar 18 13:12:00 crc kubenswrapper[4921]: E0318 13:12:00.155125 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="extract-content" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.155196 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="extract-content" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.155442 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2075a29d-49cd-488c-b1c7-02142b1eb04a" containerName="registry-server" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.156034 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.162661 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.162908 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.163091 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.163325 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-gf76r"] Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.275706 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhf6\" (UniqueName: \"kubernetes.io/projected/ba393e3a-e4e4-4432-a5fe-08e5592f7d06-kube-api-access-hlhf6\") pod \"auto-csr-approver-29563992-gf76r\" (UID: \"ba393e3a-e4e4-4432-a5fe-08e5592f7d06\") " pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.377030 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhf6\" (UniqueName: \"kubernetes.io/projected/ba393e3a-e4e4-4432-a5fe-08e5592f7d06-kube-api-access-hlhf6\") pod \"auto-csr-approver-29563992-gf76r\" (UID: \"ba393e3a-e4e4-4432-a5fe-08e5592f7d06\") " pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.395501 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhf6\" (UniqueName: \"kubernetes.io/projected/ba393e3a-e4e4-4432-a5fe-08e5592f7d06-kube-api-access-hlhf6\") pod \"auto-csr-approver-29563992-gf76r\" (UID: \"ba393e3a-e4e4-4432-a5fe-08e5592f7d06\") " 
pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.476178 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:00 crc kubenswrapper[4921]: I0318 13:12:00.878189 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-gf76r"] Mar 18 13:12:01 crc kubenswrapper[4921]: I0318 13:12:01.641477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-gf76r" event={"ID":"ba393e3a-e4e4-4432-a5fe-08e5592f7d06","Type":"ContainerStarted","Data":"23f1cbb2c39269aaddb8c8fb9f330f8b61f3668decbda972dc0c51b18fe4a1fa"} Mar 18 13:12:02 crc kubenswrapper[4921]: I0318 13:12:02.650702 4921 generic.go:334] "Generic (PLEG): container finished" podID="ba393e3a-e4e4-4432-a5fe-08e5592f7d06" containerID="fa2b775749bf51918b806251daec0ce33eaec0f8a1dbc41761ecf7bdde442ebe" exitCode=0 Mar 18 13:12:02 crc kubenswrapper[4921]: I0318 13:12:02.650869 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-gf76r" event={"ID":"ba393e3a-e4e4-4432-a5fe-08e5592f7d06","Type":"ContainerDied","Data":"fa2b775749bf51918b806251daec0ce33eaec0f8a1dbc41761ecf7bdde442ebe"} Mar 18 13:12:03 crc kubenswrapper[4921]: I0318 13:12:03.919428 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:03 crc kubenswrapper[4921]: I0318 13:12:03.932437 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlhf6\" (UniqueName: \"kubernetes.io/projected/ba393e3a-e4e4-4432-a5fe-08e5592f7d06-kube-api-access-hlhf6\") pod \"ba393e3a-e4e4-4432-a5fe-08e5592f7d06\" (UID: \"ba393e3a-e4e4-4432-a5fe-08e5592f7d06\") " Mar 18 13:12:03 crc kubenswrapper[4921]: I0318 13:12:03.938459 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba393e3a-e4e4-4432-a5fe-08e5592f7d06-kube-api-access-hlhf6" (OuterVolumeSpecName: "kube-api-access-hlhf6") pod "ba393e3a-e4e4-4432-a5fe-08e5592f7d06" (UID: "ba393e3a-e4e4-4432-a5fe-08e5592f7d06"). InnerVolumeSpecName "kube-api-access-hlhf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:12:04 crc kubenswrapper[4921]: I0318 13:12:04.033831 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlhf6\" (UniqueName: \"kubernetes.io/projected/ba393e3a-e4e4-4432-a5fe-08e5592f7d06-kube-api-access-hlhf6\") on node \"crc\" DevicePath \"\"" Mar 18 13:12:04 crc kubenswrapper[4921]: I0318 13:12:04.669712 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-gf76r" event={"ID":"ba393e3a-e4e4-4432-a5fe-08e5592f7d06","Type":"ContainerDied","Data":"23f1cbb2c39269aaddb8c8fb9f330f8b61f3668decbda972dc0c51b18fe4a1fa"} Mar 18 13:12:04 crc kubenswrapper[4921]: I0318 13:12:04.670306 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f1cbb2c39269aaddb8c8fb9f330f8b61f3668decbda972dc0c51b18fe4a1fa" Mar 18 13:12:04 crc kubenswrapper[4921]: I0318 13:12:04.669759 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-gf76r" Mar 18 13:12:04 crc kubenswrapper[4921]: I0318 13:12:04.990702 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-84lsj"] Mar 18 13:12:04 crc kubenswrapper[4921]: I0318 13:12:04.997426 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-84lsj"] Mar 18 13:12:05 crc kubenswrapper[4921]: I0318 13:12:05.218442 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0714299b-200c-4333-ad4b-b925ebeac926" path="/var/lib/kubelet/pods/0714299b-200c-4333-ad4b-b925ebeac926/volumes" Mar 18 13:12:06 crc kubenswrapper[4921]: I0318 13:12:06.208554 4921 scope.go:117] "RemoveContainer" containerID="331b5f285a5d86be19fd494a86afd73ab22c3b289871fa8852240cfad5a60fb5" Mar 18 13:12:08 crc kubenswrapper[4921]: I0318 13:12:08.209705 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:12:08 crc kubenswrapper[4921]: E0318 13:12:08.210404 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:12:22 crc kubenswrapper[4921]: I0318 13:12:22.208743 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:12:22 crc kubenswrapper[4921]: E0318 13:12:22.209195 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:12:37 crc kubenswrapper[4921]: I0318 13:12:37.209527 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:12:37 crc kubenswrapper[4921]: E0318 13:12:37.210255 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:12:51 crc kubenswrapper[4921]: I0318 13:12:51.214360 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:12:51 crc kubenswrapper[4921]: E0318 13:12:51.215264 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:13:02 crc kubenswrapper[4921]: I0318 13:13:02.209346 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:13:02 crc kubenswrapper[4921]: E0318 13:13:02.210091 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:13:13 crc kubenswrapper[4921]: I0318 13:13:13.209417 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:13:13 crc kubenswrapper[4921]: E0318 13:13:13.210059 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:13:28 crc kubenswrapper[4921]: I0318 13:13:28.208778 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:13:28 crc kubenswrapper[4921]: E0318 13:13:28.209569 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:13:41 crc kubenswrapper[4921]: I0318 13:13:41.216094 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:13:41 crc kubenswrapper[4921]: E0318 13:13:41.216885 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:13:56 crc kubenswrapper[4921]: I0318 13:13:56.210328 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:13:56 crc kubenswrapper[4921]: E0318 13:13:56.211553 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.147739 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563994-72dpw"] Mar 18 13:14:00 crc kubenswrapper[4921]: E0318 13:14:00.148455 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba393e3a-e4e4-4432-a5fe-08e5592f7d06" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.148473 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba393e3a-e4e4-4432-a5fe-08e5592f7d06" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.148682 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba393e3a-e4e4-4432-a5fe-08e5592f7d06" containerName="oc" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.149373 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.152640 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.153522 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.155527 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.162251 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-72dpw"] Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.230525 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmj6\" (UniqueName: \"kubernetes.io/projected/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8-kube-api-access-4xmj6\") pod \"auto-csr-approver-29563994-72dpw\" (UID: \"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8\") " pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.332210 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmj6\" (UniqueName: \"kubernetes.io/projected/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8-kube-api-access-4xmj6\") pod \"auto-csr-approver-29563994-72dpw\" (UID: \"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8\") " pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.352400 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmj6\" (UniqueName: \"kubernetes.io/projected/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8-kube-api-access-4xmj6\") pod \"auto-csr-approver-29563994-72dpw\" (UID: \"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8\") " 
pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.470409 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:00 crc kubenswrapper[4921]: I0318 13:14:00.879764 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-72dpw"] Mar 18 13:14:01 crc kubenswrapper[4921]: I0318 13:14:01.477226 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-72dpw" event={"ID":"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8","Type":"ContainerStarted","Data":"96597b57b2b27712af2677037a3a9d16d0831a3e3ae414326d398fcadcd13cd7"} Mar 18 13:14:03 crc kubenswrapper[4921]: I0318 13:14:03.494596 4921 generic.go:334] "Generic (PLEG): container finished" podID="9c3d261c-7188-4f22-9d2a-f1fb367b2eb8" containerID="db9f9286d25a3a2b593147005dd815eca4f69ff826a24b4333c4029011044d12" exitCode=0 Mar 18 13:14:03 crc kubenswrapper[4921]: I0318 13:14:03.494742 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-72dpw" event={"ID":"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8","Type":"ContainerDied","Data":"db9f9286d25a3a2b593147005dd815eca4f69ff826a24b4333c4029011044d12"} Mar 18 13:14:04 crc kubenswrapper[4921]: I0318 13:14:04.767324 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:04 crc kubenswrapper[4921]: I0318 13:14:04.894518 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xmj6\" (UniqueName: \"kubernetes.io/projected/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8-kube-api-access-4xmj6\") pod \"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8\" (UID: \"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8\") " Mar 18 13:14:04 crc kubenswrapper[4921]: I0318 13:14:04.900083 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8-kube-api-access-4xmj6" (OuterVolumeSpecName: "kube-api-access-4xmj6") pod "9c3d261c-7188-4f22-9d2a-f1fb367b2eb8" (UID: "9c3d261c-7188-4f22-9d2a-f1fb367b2eb8"). InnerVolumeSpecName "kube-api-access-4xmj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:14:04 crc kubenswrapper[4921]: I0318 13:14:04.995806 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xmj6\" (UniqueName: \"kubernetes.io/projected/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8-kube-api-access-4xmj6\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:05 crc kubenswrapper[4921]: I0318 13:14:05.511600 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-72dpw" event={"ID":"9c3d261c-7188-4f22-9d2a-f1fb367b2eb8","Type":"ContainerDied","Data":"96597b57b2b27712af2677037a3a9d16d0831a3e3ae414326d398fcadcd13cd7"} Mar 18 13:14:05 crc kubenswrapper[4921]: I0318 13:14:05.511664 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96597b57b2b27712af2677037a3a9d16d0831a3e3ae414326d398fcadcd13cd7" Mar 18 13:14:05 crc kubenswrapper[4921]: I0318 13:14:05.511732 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-72dpw" Mar 18 13:14:05 crc kubenswrapper[4921]: I0318 13:14:05.839301 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-shx76"] Mar 18 13:14:05 crc kubenswrapper[4921]: I0318 13:14:05.844883 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-shx76"] Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.186592 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l2br7"] Mar 18 13:14:07 crc kubenswrapper[4921]: E0318 13:14:07.186980 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3d261c-7188-4f22-9d2a-f1fb367b2eb8" containerName="oc" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.186999 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3d261c-7188-4f22-9d2a-f1fb367b2eb8" containerName="oc" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.187213 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3d261c-7188-4f22-9d2a-f1fb367b2eb8" containerName="oc" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.188294 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.196713 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2br7"] Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.210203 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:14:07 crc kubenswrapper[4921]: E0318 13:14:07.210437 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.218729 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa" path="/var/lib/kubelet/pods/a98c2ec4-95d5-41bf-8d54-ed1fc37f49fa/volumes" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.332274 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-catalog-content\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.332336 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrgj6\" (UniqueName: \"kubernetes.io/projected/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-kube-api-access-rrgj6\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" 
Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.332408 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-utilities\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.433879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-catalog-content\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.433928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrgj6\" (UniqueName: \"kubernetes.io/projected/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-kube-api-access-rrgj6\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.433950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-utilities\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.434516 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-catalog-content\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: 
I0318 13:14:07.434554 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-utilities\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.454510 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrgj6\" (UniqueName: \"kubernetes.io/projected/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-kube-api-access-rrgj6\") pod \"redhat-operators-l2br7\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.509535 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:07 crc kubenswrapper[4921]: I0318 13:14:07.983590 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2br7"] Mar 18 13:14:08 crc kubenswrapper[4921]: I0318 13:14:08.532909 4921 generic.go:334] "Generic (PLEG): container finished" podID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerID="15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92" exitCode=0 Mar 18 13:14:08 crc kubenswrapper[4921]: I0318 13:14:08.533007 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2br7" event={"ID":"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8","Type":"ContainerDied","Data":"15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92"} Mar 18 13:14:08 crc kubenswrapper[4921]: I0318 13:14:08.533262 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2br7" event={"ID":"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8","Type":"ContainerStarted","Data":"d84095e289346d20a611d10aef1a701755de01aadf1a2f741771fbaa2cfb5bf3"} Mar 18 13:14:10 crc 
kubenswrapper[4921]: I0318 13:14:10.548847 4921 generic.go:334] "Generic (PLEG): container finished" podID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerID="a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4" exitCode=0 Mar 18 13:14:10 crc kubenswrapper[4921]: I0318 13:14:10.548983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2br7" event={"ID":"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8","Type":"ContainerDied","Data":"a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4"} Mar 18 13:14:11 crc kubenswrapper[4921]: I0318 13:14:11.557585 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2br7" event={"ID":"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8","Type":"ContainerStarted","Data":"f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a"} Mar 18 13:14:11 crc kubenswrapper[4921]: I0318 13:14:11.575393 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l2br7" podStartSLOduration=2.026722812 podStartE2EDuration="4.575368793s" podCreationTimestamp="2026-03-18 13:14:07 +0000 UTC" firstStartedPulling="2026-03-18 13:14:08.535082692 +0000 UTC m=+3868.085003331" lastFinishedPulling="2026-03-18 13:14:11.083728683 +0000 UTC m=+3870.633649312" observedRunningTime="2026-03-18 13:14:11.574058775 +0000 UTC m=+3871.123979414" watchObservedRunningTime="2026-03-18 13:14:11.575368793 +0000 UTC m=+3871.125289452" Mar 18 13:14:17 crc kubenswrapper[4921]: I0318 13:14:17.509926 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:17 crc kubenswrapper[4921]: I0318 13:14:17.510459 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:18 crc kubenswrapper[4921]: I0318 13:14:18.209775 4921 scope.go:117] "RemoveContainer" 
containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:14:18 crc kubenswrapper[4921]: E0318 13:14:18.209997 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:14:18 crc kubenswrapper[4921]: I0318 13:14:18.556873 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2br7" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="registry-server" probeResult="failure" output=< Mar 18 13:14:18 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 13:14:18 crc kubenswrapper[4921]: > Mar 18 13:14:27 crc kubenswrapper[4921]: I0318 13:14:27.563662 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:27 crc kubenswrapper[4921]: I0318 13:14:27.620848 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:27 crc kubenswrapper[4921]: I0318 13:14:27.798933 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2br7"] Mar 18 13:14:28 crc kubenswrapper[4921]: I0318 13:14:28.669290 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l2br7" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="registry-server" containerID="cri-o://f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a" gracePeriod=2 Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.125567 4921 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.265297 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-utilities\") pod \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.265399 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-catalog-content\") pod \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.265454 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrgj6\" (UniqueName: \"kubernetes.io/projected/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-kube-api-access-rrgj6\") pod \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\" (UID: \"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8\") " Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.266355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-utilities" (OuterVolumeSpecName: "utilities") pod "402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" (UID: "402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.271713 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-kube-api-access-rrgj6" (OuterVolumeSpecName: "kube-api-access-rrgj6") pod "402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" (UID: "402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8"). InnerVolumeSpecName "kube-api-access-rrgj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.366688 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.366717 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrgj6\" (UniqueName: \"kubernetes.io/projected/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-kube-api-access-rrgj6\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.420357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" (UID: "402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.468561 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.678816 4921 generic.go:334] "Generic (PLEG): container finished" podID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerID="f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a" exitCode=0 Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.678873 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2br7" event={"ID":"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8","Type":"ContainerDied","Data":"f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a"} Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.678896 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2br7" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.678917 4921 scope.go:117] "RemoveContainer" containerID="f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.678905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2br7" event={"ID":"402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8","Type":"ContainerDied","Data":"d84095e289346d20a611d10aef1a701755de01aadf1a2f741771fbaa2cfb5bf3"} Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.711449 4921 scope.go:117] "RemoveContainer" containerID="a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.713583 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2br7"] Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.721093 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l2br7"] Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.733754 4921 scope.go:117] "RemoveContainer" containerID="15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.758513 4921 scope.go:117] "RemoveContainer" containerID="f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a" Mar 18 13:14:29 crc kubenswrapper[4921]: E0318 13:14:29.759034 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a\": container with ID starting with f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a not found: ID does not exist" containerID="f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.759100 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a"} err="failed to get container status \"f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a\": rpc error: code = NotFound desc = could not find container \"f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a\": container with ID starting with f4785314dec2702819e32443b17e4112286365401abc2dcdf2fc42a0685eaa8a not found: ID does not exist" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.759154 4921 scope.go:117] "RemoveContainer" containerID="a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4" Mar 18 13:14:29 crc kubenswrapper[4921]: E0318 13:14:29.759508 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4\": container with ID starting with a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4 not found: ID does not exist" containerID="a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.759536 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4"} err="failed to get container status \"a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4\": rpc error: code = NotFound desc = could not find container \"a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4\": container with ID starting with a433f5744124402a824e25c3b4f376a6d808c865cda35019fa242e1693c0aae4 not found: ID does not exist" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.759551 4921 scope.go:117] "RemoveContainer" containerID="15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92" Mar 18 13:14:29 crc kubenswrapper[4921]: E0318 
13:14:29.759830 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92\": container with ID starting with 15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92 not found: ID does not exist" containerID="15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92" Mar 18 13:14:29 crc kubenswrapper[4921]: I0318 13:14:29.759871 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92"} err="failed to get container status \"15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92\": rpc error: code = NotFound desc = could not find container \"15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92\": container with ID starting with 15d8cf5134d0937f447609df54f04a05551e1105e14eacc8d85754ddf3c78b92 not found: ID does not exist" Mar 18 13:14:31 crc kubenswrapper[4921]: I0318 13:14:31.220870 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" path="/var/lib/kubelet/pods/402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8/volumes" Mar 18 13:14:33 crc kubenswrapper[4921]: I0318 13:14:33.209411 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:14:33 crc kubenswrapper[4921]: E0318 13:14:33.209908 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:14:45 crc kubenswrapper[4921]: I0318 13:14:45.209232 
4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:14:45 crc kubenswrapper[4921]: E0318 13:14:45.210081 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:14:58 crc kubenswrapper[4921]: I0318 13:14:58.209212 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691" Mar 18 13:14:58 crc kubenswrapper[4921]: I0318 13:14:58.891374 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"fd6758da39713f31e7f5a1a978c21ad8c192c00e9ab6033de9d0030c01727af8"} Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.161277 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"] Mar 18 13:15:00 crc kubenswrapper[4921]: E0318 13:15:00.161888 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="registry-server" Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.161900 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="registry-server" Mar 18 13:15:00 crc kubenswrapper[4921]: E0318 13:15:00.161911 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="extract-content" Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.161917 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="extract-content"
Mar 18 13:15:00 crc kubenswrapper[4921]: E0318 13:15:00.161935 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="extract-utilities"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.161942 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="extract-utilities"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.162085 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="402c46fc-e2d0-4cc7-81bc-ffdf7d4943d8" containerName="registry-server"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.162573 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.165000 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.165008 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.181360 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"]
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.228348 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dww\" (UniqueName: \"kubernetes.io/projected/c1186373-3831-4f51-b591-a01e74e1f199-kube-api-access-42dww\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.228439 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1186373-3831-4f51-b591-a01e74e1f199-config-volume\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.228462 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1186373-3831-4f51-b591-a01e74e1f199-secret-volume\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.329725 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42dww\" (UniqueName: \"kubernetes.io/projected/c1186373-3831-4f51-b591-a01e74e1f199-kube-api-access-42dww\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.329858 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1186373-3831-4f51-b591-a01e74e1f199-config-volume\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.329888 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1186373-3831-4f51-b591-a01e74e1f199-secret-volume\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.331485 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1186373-3831-4f51-b591-a01e74e1f199-config-volume\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.337899 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1186373-3831-4f51-b591-a01e74e1f199-secret-volume\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.349406 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dww\" (UniqueName: \"kubernetes.io/projected/c1186373-3831-4f51-b591-a01e74e1f199-kube-api-access-42dww\") pod \"collect-profiles-29563995-x4whr\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.486484 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:00 crc kubenswrapper[4921]: I0318 13:15:00.935705 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"]
Mar 18 13:15:00 crc kubenswrapper[4921]: W0318 13:15:00.940767 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1186373_3831_4f51_b591_a01e74e1f199.slice/crio-262b9be9b2ee337268aeff3abfbae2a042e758a3ea0e9ebf24e7119bb38c56a0 WatchSource:0}: Error finding container 262b9be9b2ee337268aeff3abfbae2a042e758a3ea0e9ebf24e7119bb38c56a0: Status 404 returned error can't find the container with id 262b9be9b2ee337268aeff3abfbae2a042e758a3ea0e9ebf24e7119bb38c56a0
Mar 18 13:15:01 crc kubenswrapper[4921]: I0318 13:15:01.921621 4921 generic.go:334] "Generic (PLEG): container finished" podID="c1186373-3831-4f51-b591-a01e74e1f199" containerID="95d5594c7ab4a13802663d99a28123761e057657485e4147ad54b4ebe91a0b6d" exitCode=0
Mar 18 13:15:01 crc kubenswrapper[4921]: I0318 13:15:01.921996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr" event={"ID":"c1186373-3831-4f51-b591-a01e74e1f199","Type":"ContainerDied","Data":"95d5594c7ab4a13802663d99a28123761e057657485e4147ad54b4ebe91a0b6d"}
Mar 18 13:15:01 crc kubenswrapper[4921]: I0318 13:15:01.922046 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr" event={"ID":"c1186373-3831-4f51-b591-a01e74e1f199","Type":"ContainerStarted","Data":"262b9be9b2ee337268aeff3abfbae2a042e758a3ea0e9ebf24e7119bb38c56a0"}
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.324962 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.386484 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1186373-3831-4f51-b591-a01e74e1f199-config-volume\") pod \"c1186373-3831-4f51-b591-a01e74e1f199\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") "
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.386589 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42dww\" (UniqueName: \"kubernetes.io/projected/c1186373-3831-4f51-b591-a01e74e1f199-kube-api-access-42dww\") pod \"c1186373-3831-4f51-b591-a01e74e1f199\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") "
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.386617 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1186373-3831-4f51-b591-a01e74e1f199-secret-volume\") pod \"c1186373-3831-4f51-b591-a01e74e1f199\" (UID: \"c1186373-3831-4f51-b591-a01e74e1f199\") "
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.387464 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1186373-3831-4f51-b591-a01e74e1f199-config-volume" (OuterVolumeSpecName: "config-volume") pod "c1186373-3831-4f51-b591-a01e74e1f199" (UID: "c1186373-3831-4f51-b591-a01e74e1f199"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.394849 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1186373-3831-4f51-b591-a01e74e1f199-kube-api-access-42dww" (OuterVolumeSpecName: "kube-api-access-42dww") pod "c1186373-3831-4f51-b591-a01e74e1f199" (UID: "c1186373-3831-4f51-b591-a01e74e1f199"). InnerVolumeSpecName "kube-api-access-42dww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.395253 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1186373-3831-4f51-b591-a01e74e1f199-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c1186373-3831-4f51-b591-a01e74e1f199" (UID: "c1186373-3831-4f51-b591-a01e74e1f199"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.488484 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1186373-3831-4f51-b591-a01e74e1f199-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.489288 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42dww\" (UniqueName: \"kubernetes.io/projected/c1186373-3831-4f51-b591-a01e74e1f199-kube-api-access-42dww\") on node \"crc\" DevicePath \"\""
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.490280 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1186373-3831-4f51-b591-a01e74e1f199-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.941546 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr" event={"ID":"c1186373-3831-4f51-b591-a01e74e1f199","Type":"ContainerDied","Data":"262b9be9b2ee337268aeff3abfbae2a042e758a3ea0e9ebf24e7119bb38c56a0"}
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.942204 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262b9be9b2ee337268aeff3abfbae2a042e758a3ea0e9ebf24e7119bb38c56a0"
Mar 18 13:15:03 crc kubenswrapper[4921]: I0318 13:15:03.942297 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"
Mar 18 13:15:04 crc kubenswrapper[4921]: I0318 13:15:04.414516 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx"]
Mar 18 13:15:04 crc kubenswrapper[4921]: I0318 13:15:04.420554 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-9v6cx"]
Mar 18 13:15:05 crc kubenswrapper[4921]: I0318 13:15:05.219380 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6d5c48-5eeb-4cbb-bdba-c70202696ba6" path="/var/lib/kubelet/pods/de6d5c48-5eeb-4cbb-bdba-c70202696ba6/volumes"
Mar 18 13:15:06 crc kubenswrapper[4921]: I0318 13:15:06.305255 4921 scope.go:117] "RemoveContainer" containerID="25252fffa055349439fbb46c753e0ac45be0c99163d5eafb3cba190386243906"
Mar 18 13:15:06 crc kubenswrapper[4921]: I0318 13:15:06.403227 4921 scope.go:117] "RemoveContainer" containerID="516df1388f4566cb03c09777d3a0acdb98f6a4a13e36916599314b90c1d83912"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.148474 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563996-fdng5"]
Mar 18 13:16:00 crc kubenswrapper[4921]: E0318 13:16:00.150431 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1186373-3831-4f51-b591-a01e74e1f199" containerName="collect-profiles"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.150451 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1186373-3831-4f51-b591-a01e74e1f199" containerName="collect-profiles"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.150592 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1186373-3831-4f51-b591-a01e74e1f199" containerName="collect-profiles"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.151078 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.153301 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.153427 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.153719 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.159880 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-fdng5"]
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.227490 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbcdj\" (UniqueName: \"kubernetes.io/projected/fab1ca47-1577-4327-8e85-6cbf07168495-kube-api-access-nbcdj\") pod \"auto-csr-approver-29563996-fdng5\" (UID: \"fab1ca47-1577-4327-8e85-6cbf07168495\") " pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.328878 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbcdj\" (UniqueName: \"kubernetes.io/projected/fab1ca47-1577-4327-8e85-6cbf07168495-kube-api-access-nbcdj\") pod \"auto-csr-approver-29563996-fdng5\" (UID: \"fab1ca47-1577-4327-8e85-6cbf07168495\") " pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.349185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbcdj\" (UniqueName: \"kubernetes.io/projected/fab1ca47-1577-4327-8e85-6cbf07168495-kube-api-access-nbcdj\") pod \"auto-csr-approver-29563996-fdng5\" (UID: \"fab1ca47-1577-4327-8e85-6cbf07168495\") " pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.470040 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.872354 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-fdng5"]
Mar 18 13:16:00 crc kubenswrapper[4921]: I0318 13:16:00.880179 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:16:01 crc kubenswrapper[4921]: I0318 13:16:01.334605 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-fdng5" event={"ID":"fab1ca47-1577-4327-8e85-6cbf07168495","Type":"ContainerStarted","Data":"591f4693be3f1ad9a862600f8059f8175c0b287dc5c3267f4efeb1b73d7a78d1"}
Mar 18 13:16:11 crc kubenswrapper[4921]: I0318 13:16:11.478926 4921 generic.go:334] "Generic (PLEG): container finished" podID="fab1ca47-1577-4327-8e85-6cbf07168495" containerID="c401e9e1bbb20b82d17325861b51dfba697f6028dd04fb9f265a09982e0b5bf2" exitCode=0
Mar 18 13:16:11 crc kubenswrapper[4921]: I0318 13:16:11.479021 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-fdng5" event={"ID":"fab1ca47-1577-4327-8e85-6cbf07168495","Type":"ContainerDied","Data":"c401e9e1bbb20b82d17325861b51dfba697f6028dd04fb9f265a09982e0b5bf2"}
Mar 18 13:16:12 crc kubenswrapper[4921]: I0318 13:16:12.769171 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:12 crc kubenswrapper[4921]: I0318 13:16:12.918410 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbcdj\" (UniqueName: \"kubernetes.io/projected/fab1ca47-1577-4327-8e85-6cbf07168495-kube-api-access-nbcdj\") pod \"fab1ca47-1577-4327-8e85-6cbf07168495\" (UID: \"fab1ca47-1577-4327-8e85-6cbf07168495\") "
Mar 18 13:16:12 crc kubenswrapper[4921]: I0318 13:16:12.924989 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab1ca47-1577-4327-8e85-6cbf07168495-kube-api-access-nbcdj" (OuterVolumeSpecName: "kube-api-access-nbcdj") pod "fab1ca47-1577-4327-8e85-6cbf07168495" (UID: "fab1ca47-1577-4327-8e85-6cbf07168495"). InnerVolumeSpecName "kube-api-access-nbcdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:16:13 crc kubenswrapper[4921]: I0318 13:16:13.020477 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbcdj\" (UniqueName: \"kubernetes.io/projected/fab1ca47-1577-4327-8e85-6cbf07168495-kube-api-access-nbcdj\") on node \"crc\" DevicePath \"\""
Mar 18 13:16:13 crc kubenswrapper[4921]: I0318 13:16:13.494122 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-fdng5" event={"ID":"fab1ca47-1577-4327-8e85-6cbf07168495","Type":"ContainerDied","Data":"591f4693be3f1ad9a862600f8059f8175c0b287dc5c3267f4efeb1b73d7a78d1"}
Mar 18 13:16:13 crc kubenswrapper[4921]: I0318 13:16:13.494171 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="591f4693be3f1ad9a862600f8059f8175c0b287dc5c3267f4efeb1b73d7a78d1"
Mar 18 13:16:13 crc kubenswrapper[4921]: I0318 13:16:13.494195 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-fdng5"
Mar 18 13:16:13 crc kubenswrapper[4921]: I0318 13:16:13.838095 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-sdgcp"]
Mar 18 13:16:13 crc kubenswrapper[4921]: I0318 13:16:13.844171 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-sdgcp"]
Mar 18 13:16:15 crc kubenswrapper[4921]: I0318 13:16:15.224657 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf8163f-8c37-4da7-900a-edad654804a5" path="/var/lib/kubelet/pods/bbf8163f-8c37-4da7-900a-edad654804a5/volumes"
Mar 18 13:17:06 crc kubenswrapper[4921]: I0318 13:17:06.490557 4921 scope.go:117] "RemoveContainer" containerID="c14d25496252783b3e83edb9e4d7cbfecbd5d74635a6ed65948a20db2e2b750a"
Mar 18 13:17:17 crc kubenswrapper[4921]: I0318 13:17:17.080876 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:17:17 crc kubenswrapper[4921]: I0318 13:17:17.082249 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:17:47 crc kubenswrapper[4921]: I0318 13:17:47.081523 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:17:47 crc kubenswrapper[4921]: I0318 13:17:47.082579 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.151655 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563998-npd9w"]
Mar 18 13:18:00 crc kubenswrapper[4921]: E0318 13:18:00.152629 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab1ca47-1577-4327-8e85-6cbf07168495" containerName="oc"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.152647 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab1ca47-1577-4327-8e85-6cbf07168495" containerName="oc"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.152808 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab1ca47-1577-4327-8e85-6cbf07168495" containerName="oc"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.153414 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.155877 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.155935 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.156310 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.171205 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-npd9w"]
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.276419 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7blp\" (UniqueName: \"kubernetes.io/projected/57a6cde1-043f-420d-8653-f624145df1b2-kube-api-access-r7blp\") pod \"auto-csr-approver-29563998-npd9w\" (UID: \"57a6cde1-043f-420d-8653-f624145df1b2\") " pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.378994 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7blp\" (UniqueName: \"kubernetes.io/projected/57a6cde1-043f-420d-8653-f624145df1b2-kube-api-access-r7blp\") pod \"auto-csr-approver-29563998-npd9w\" (UID: \"57a6cde1-043f-420d-8653-f624145df1b2\") " pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.399463 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7blp\" (UniqueName: \"kubernetes.io/projected/57a6cde1-043f-420d-8653-f624145df1b2-kube-api-access-r7blp\") pod \"auto-csr-approver-29563998-npd9w\" (UID: \"57a6cde1-043f-420d-8653-f624145df1b2\") " pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:00 crc kubenswrapper[4921]: I0318 13:18:00.476276 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:01 crc kubenswrapper[4921]: I0318 13:18:00.776630 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-npd9w"]
Mar 18 13:18:01 crc kubenswrapper[4921]: I0318 13:18:01.261815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-npd9w" event={"ID":"57a6cde1-043f-420d-8653-f624145df1b2","Type":"ContainerStarted","Data":"a3ed86e7d17621ffc5cf676dcfe3317b323a282707f74775f9b74cf9957d7987"}
Mar 18 13:18:02 crc kubenswrapper[4921]: I0318 13:18:02.271182 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-npd9w" event={"ID":"57a6cde1-043f-420d-8653-f624145df1b2","Type":"ContainerStarted","Data":"f7df322ca43f842d0ecec0f104fce020eb2962957933461a7ed4c504826bc12e"}
Mar 18 13:18:02 crc kubenswrapper[4921]: I0318 13:18:02.287087 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563998-npd9w" podStartSLOduration=1.199807451 podStartE2EDuration="2.287064974s" podCreationTimestamp="2026-03-18 13:18:00 +0000 UTC" firstStartedPulling="2026-03-18 13:18:00.791082075 +0000 UTC m=+4100.341002714" lastFinishedPulling="2026-03-18 13:18:01.878339598 +0000 UTC m=+4101.428260237" observedRunningTime="2026-03-18 13:18:02.285845999 +0000 UTC m=+4101.835766648" watchObservedRunningTime="2026-03-18 13:18:02.287064974 +0000 UTC m=+4101.836985613"
Mar 18 13:18:03 crc kubenswrapper[4921]: I0318 13:18:03.283489 4921 generic.go:334] "Generic (PLEG): container finished" podID="57a6cde1-043f-420d-8653-f624145df1b2" containerID="f7df322ca43f842d0ecec0f104fce020eb2962957933461a7ed4c504826bc12e" exitCode=0
Mar 18 13:18:03 crc kubenswrapper[4921]: I0318 13:18:03.284103 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-npd9w" event={"ID":"57a6cde1-043f-420d-8653-f624145df1b2","Type":"ContainerDied","Data":"f7df322ca43f842d0ecec0f104fce020eb2962957933461a7ed4c504826bc12e"}
Mar 18 13:18:04 crc kubenswrapper[4921]: I0318 13:18:04.527355 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:04 crc kubenswrapper[4921]: I0318 13:18:04.638277 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7blp\" (UniqueName: \"kubernetes.io/projected/57a6cde1-043f-420d-8653-f624145df1b2-kube-api-access-r7blp\") pod \"57a6cde1-043f-420d-8653-f624145df1b2\" (UID: \"57a6cde1-043f-420d-8653-f624145df1b2\") "
Mar 18 13:18:04 crc kubenswrapper[4921]: I0318 13:18:04.646846 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a6cde1-043f-420d-8653-f624145df1b2-kube-api-access-r7blp" (OuterVolumeSpecName: "kube-api-access-r7blp") pod "57a6cde1-043f-420d-8653-f624145df1b2" (UID: "57a6cde1-043f-420d-8653-f624145df1b2"). InnerVolumeSpecName "kube-api-access-r7blp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:18:04 crc kubenswrapper[4921]: I0318 13:18:04.740739 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7blp\" (UniqueName: \"kubernetes.io/projected/57a6cde1-043f-420d-8653-f624145df1b2-kube-api-access-r7blp\") on node \"crc\" DevicePath \"\""
Mar 18 13:18:05 crc kubenswrapper[4921]: I0318 13:18:05.298927 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-npd9w" event={"ID":"57a6cde1-043f-420d-8653-f624145df1b2","Type":"ContainerDied","Data":"a3ed86e7d17621ffc5cf676dcfe3317b323a282707f74775f9b74cf9957d7987"}
Mar 18 13:18:05 crc kubenswrapper[4921]: I0318 13:18:05.298978 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ed86e7d17621ffc5cf676dcfe3317b323a282707f74775f9b74cf9957d7987"
Mar 18 13:18:05 crc kubenswrapper[4921]: I0318 13:18:05.299037 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-npd9w"
Mar 18 13:18:05 crc kubenswrapper[4921]: I0318 13:18:05.349816 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-gf76r"]
Mar 18 13:18:05 crc kubenswrapper[4921]: I0318 13:18:05.355367 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-gf76r"]
Mar 18 13:18:07 crc kubenswrapper[4921]: I0318 13:18:07.217223 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba393e3a-e4e4-4432-a5fe-08e5592f7d06" path="/var/lib/kubelet/pods/ba393e3a-e4e4-4432-a5fe-08e5592f7d06/volumes"
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.080805 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.081513 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.081571 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7"
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.082313 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd6758da39713f31e7f5a1a978c21ad8c192c00e9ab6033de9d0030c01727af8"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.082382 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://fd6758da39713f31e7f5a1a978c21ad8c192c00e9ab6033de9d0030c01727af8" gracePeriod=600
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.384331 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="fd6758da39713f31e7f5a1a978c21ad8c192c00e9ab6033de9d0030c01727af8" exitCode=0
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.384547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"fd6758da39713f31e7f5a1a978c21ad8c192c00e9ab6033de9d0030c01727af8"}
Mar 18 13:18:17 crc kubenswrapper[4921]: I0318 13:18:17.384614 4921 scope.go:117] "RemoveContainer" containerID="461aa87e3d0b83479c7855d0a361787e5026adbe4855af894dea91098511c691"
Mar 18 13:18:18 crc kubenswrapper[4921]: I0318 13:18:18.393747 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d"}
Mar 18 13:19:06 crc kubenswrapper[4921]: I0318 13:19:06.571390 4921 scope.go:117] "RemoveContainer" containerID="fa2b775749bf51918b806251daec0ce33eaec0f8a1dbc41761ecf7bdde442ebe"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.144812 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564000-6zz8n"]
Mar 18 13:20:00 crc kubenswrapper[4921]: E0318 13:20:00.145813 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a6cde1-043f-420d-8653-f624145df1b2" containerName="oc"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.145832 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a6cde1-043f-420d-8653-f624145df1b2" containerName="oc"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.146023 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a6cde1-043f-420d-8653-f624145df1b2" containerName="oc"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.151972 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-6zz8n"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.153441 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-6zz8n"]
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.154780 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.154839 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.155204 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.195153 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfgp\" (UniqueName: \"kubernetes.io/projected/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a-kube-api-access-fwfgp\") pod \"auto-csr-approver-29564000-6zz8n\" (UID: \"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a\") " pod="openshift-infra/auto-csr-approver-29564000-6zz8n"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.296103 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfgp\" (UniqueName: \"kubernetes.io/projected/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a-kube-api-access-fwfgp\") pod \"auto-csr-approver-29564000-6zz8n\" (UID: \"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a\") " pod="openshift-infra/auto-csr-approver-29564000-6zz8n"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.319218 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfgp\" (UniqueName: \"kubernetes.io/projected/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a-kube-api-access-fwfgp\") pod \"auto-csr-approver-29564000-6zz8n\" (UID: \"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a\") " pod="openshift-infra/auto-csr-approver-29564000-6zz8n"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.484336 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-6zz8n"
Mar 18 13:20:00 crc kubenswrapper[4921]: I0318 13:20:00.931222 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-6zz8n"]
Mar 18 13:20:01 crc kubenswrapper[4921]: I0318 13:20:01.389041 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-6zz8n" event={"ID":"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a","Type":"ContainerStarted","Data":"85a38130a17e33a8cf2e09bf4327ad6e26d0b124cb36f549995587cbf463cb3c"}
Mar 18 13:20:03 crc kubenswrapper[4921]: I0318 13:20:03.407073 4921 generic.go:334] "Generic (PLEG): container finished" podID="9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a" containerID="b762e6bb38a3b639d806ea7c63e429447112243ec855a9630e3eecd7ddf697a2" exitCode=0
Mar 18 13:20:03 crc kubenswrapper[4921]: I0318 13:20:03.407209 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-6zz8n" event={"ID":"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a","Type":"ContainerDied","Data":"b762e6bb38a3b639d806ea7c63e429447112243ec855a9630e3eecd7ddf697a2"}
Mar 18 13:20:04 crc kubenswrapper[4921]: I0318 13:20:04.696885 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-6zz8n"
Mar 18 13:20:04 crc kubenswrapper[4921]: I0318 13:20:04.873154 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwfgp\" (UniqueName: \"kubernetes.io/projected/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a-kube-api-access-fwfgp\") pod \"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a\" (UID: \"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a\") "
Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.091763 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a-kube-api-access-fwfgp" (OuterVolumeSpecName: "kube-api-access-fwfgp") pod "9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a" (UID: "9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a"). InnerVolumeSpecName "kube-api-access-fwfgp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.177372 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwfgp\" (UniqueName: \"kubernetes.io/projected/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a-kube-api-access-fwfgp\") on node \"crc\" DevicePath \"\""
Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.423008 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-6zz8n" event={"ID":"9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a","Type":"ContainerDied","Data":"85a38130a17e33a8cf2e09bf4327ad6e26d0b124cb36f549995587cbf463cb3c"}
Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.423301 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85a38130a17e33a8cf2e09bf4327ad6e26d0b124cb36f549995587cbf463cb3c"
Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.423384 4921 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-6zz8n" Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.800558 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-72dpw"] Mar 18 13:20:05 crc kubenswrapper[4921]: I0318 13:20:05.806616 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-72dpw"] Mar 18 13:20:07 crc kubenswrapper[4921]: I0318 13:20:07.217554 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3d261c-7188-4f22-9d2a-f1fb367b2eb8" path="/var/lib/kubelet/pods/9c3d261c-7188-4f22-9d2a-f1fb367b2eb8/volumes" Mar 18 13:20:17 crc kubenswrapper[4921]: I0318 13:20:17.081626 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:20:17 crc kubenswrapper[4921]: I0318 13:20:17.082302 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:20:47 crc kubenswrapper[4921]: I0318 13:20:47.080758 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:20:47 crc kubenswrapper[4921]: I0318 13:20:47.081316 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:21:06 crc kubenswrapper[4921]: I0318 13:21:06.661265 4921 scope.go:117] "RemoveContainer" containerID="db9f9286d25a3a2b593147005dd815eca4f69ff826a24b4333c4029011044d12" Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.081471 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.082747 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.082805 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.083500 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.083573 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" gracePeriod=600 Mar 18 13:21:17 crc kubenswrapper[4921]: E0318 13:21:17.207131 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.931049 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" exitCode=0 Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.931099 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d"} Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.931163 4921 scope.go:117] "RemoveContainer" containerID="fd6758da39713f31e7f5a1a978c21ad8c192c00e9ab6033de9d0030c01727af8" Mar 18 13:21:17 crc kubenswrapper[4921]: I0318 13:21:17.932341 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:21:17 crc kubenswrapper[4921]: E0318 13:21:17.933562 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:21:33 crc kubenswrapper[4921]: I0318 13:21:33.209642 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:21:33 crc kubenswrapper[4921]: E0318 13:21:33.210380 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:21:46 crc kubenswrapper[4921]: I0318 13:21:46.209468 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:21:46 crc kubenswrapper[4921]: E0318 13:21:46.210372 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.158540 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564002-lcsn8"] Mar 18 13:22:00 crc kubenswrapper[4921]: E0318 13:22:00.159611 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a" containerName="oc" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.159629 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a" containerName="oc" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.159810 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a" containerName="oc" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.160423 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.162468 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.166427 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.168858 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.174054 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-lcsn8"] Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.209141 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:22:00 crc kubenswrapper[4921]: E0318 13:22:00.209524 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.218646 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9p6r\" (UniqueName: \"kubernetes.io/projected/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba-kube-api-access-x9p6r\") pod \"auto-csr-approver-29564002-lcsn8\" (UID: \"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba\") " pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.319833 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9p6r\" (UniqueName: \"kubernetes.io/projected/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba-kube-api-access-x9p6r\") pod \"auto-csr-approver-29564002-lcsn8\" (UID: \"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba\") " pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.341231 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9p6r\" (UniqueName: \"kubernetes.io/projected/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba-kube-api-access-x9p6r\") pod \"auto-csr-approver-29564002-lcsn8\" (UID: \"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba\") " pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.482436 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.885281 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-lcsn8"] Mar 18 13:22:00 crc kubenswrapper[4921]: I0318 13:22:00.895367 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:22:01 crc kubenswrapper[4921]: I0318 13:22:01.204023 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" event={"ID":"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba","Type":"ContainerStarted","Data":"676830fa1f7cb190bd32122e08063b67df6c43c35b7cd368615b405aa63832be"} Mar 18 13:22:02 crc kubenswrapper[4921]: I0318 13:22:02.212164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" event={"ID":"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba","Type":"ContainerStarted","Data":"9d70895c43def28365a4044f0ad4b0a4eb26203c34ab13189a0d49cc8f8b82c9"} Mar 18 13:22:02 crc kubenswrapper[4921]: I0318 13:22:02.227347 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" podStartSLOduration=1.149562043 podStartE2EDuration="2.227329594s" podCreationTimestamp="2026-03-18 13:22:00 +0000 UTC" firstStartedPulling="2026-03-18 13:22:00.895094291 +0000 UTC m=+4340.445014930" lastFinishedPulling="2026-03-18 13:22:01.972861842 +0000 UTC m=+4341.522782481" observedRunningTime="2026-03-18 13:22:02.223605718 +0000 UTC m=+4341.773526347" watchObservedRunningTime="2026-03-18 13:22:02.227329594 +0000 UTC m=+4341.777250233" Mar 18 13:22:03 crc kubenswrapper[4921]: I0318 13:22:03.220090 4921 generic.go:334] "Generic (PLEG): container finished" podID="c8892ed1-7ce1-4a66-99bf-37a3a030d7ba" containerID="9d70895c43def28365a4044f0ad4b0a4eb26203c34ab13189a0d49cc8f8b82c9" exitCode=0 Mar 18 13:22:03 crc 
kubenswrapper[4921]: I0318 13:22:03.220185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" event={"ID":"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba","Type":"ContainerDied","Data":"9d70895c43def28365a4044f0ad4b0a4eb26203c34ab13189a0d49cc8f8b82c9"} Mar 18 13:22:04 crc kubenswrapper[4921]: I0318 13:22:04.487651 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:04 crc kubenswrapper[4921]: I0318 13:22:04.643019 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9p6r\" (UniqueName: \"kubernetes.io/projected/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba-kube-api-access-x9p6r\") pod \"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba\" (UID: \"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba\") " Mar 18 13:22:04 crc kubenswrapper[4921]: I0318 13:22:04.647958 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba-kube-api-access-x9p6r" (OuterVolumeSpecName: "kube-api-access-x9p6r") pod "c8892ed1-7ce1-4a66-99bf-37a3a030d7ba" (UID: "c8892ed1-7ce1-4a66-99bf-37a3a030d7ba"). InnerVolumeSpecName "kube-api-access-x9p6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:22:04 crc kubenswrapper[4921]: I0318 13:22:04.744692 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9p6r\" (UniqueName: \"kubernetes.io/projected/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba-kube-api-access-x9p6r\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:05 crc kubenswrapper[4921]: I0318 13:22:05.232397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" event={"ID":"c8892ed1-7ce1-4a66-99bf-37a3a030d7ba","Type":"ContainerDied","Data":"676830fa1f7cb190bd32122e08063b67df6c43c35b7cd368615b405aa63832be"} Mar 18 13:22:05 crc kubenswrapper[4921]: I0318 13:22:05.232433 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-lcsn8" Mar 18 13:22:05 crc kubenswrapper[4921]: I0318 13:22:05.232443 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676830fa1f7cb190bd32122e08063b67df6c43c35b7cd368615b405aa63832be" Mar 18 13:22:05 crc kubenswrapper[4921]: I0318 13:22:05.289515 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-fdng5"] Mar 18 13:22:05 crc kubenswrapper[4921]: I0318 13:22:05.294620 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-fdng5"] Mar 18 13:22:07 crc kubenswrapper[4921]: I0318 13:22:07.219948 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab1ca47-1577-4327-8e85-6cbf07168495" path="/var/lib/kubelet/pods/fab1ca47-1577-4327-8e85-6cbf07168495/volumes" Mar 18 13:22:13 crc kubenswrapper[4921]: I0318 13:22:13.209249 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:22:13 crc kubenswrapper[4921]: E0318 13:22:13.209842 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:22:24 crc kubenswrapper[4921]: I0318 13:22:24.209549 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:22:24 crc kubenswrapper[4921]: E0318 13:22:24.210339 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:22:38 crc kubenswrapper[4921]: I0318 13:22:38.209683 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:22:38 crc kubenswrapper[4921]: E0318 13:22:38.210809 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:22:50 crc kubenswrapper[4921]: I0318 13:22:50.209455 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:22:50 crc kubenswrapper[4921]: E0318 13:22:50.210206 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:23:05 crc kubenswrapper[4921]: I0318 13:23:05.208988 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:23:05 crc kubenswrapper[4921]: E0318 13:23:05.209791 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:23:06 crc kubenswrapper[4921]: I0318 13:23:06.738441 4921 scope.go:117] "RemoveContainer" containerID="c401e9e1bbb20b82d17325861b51dfba697f6028dd04fb9f265a09982e0b5bf2" Mar 18 13:23:17 crc kubenswrapper[4921]: I0318 13:23:17.209098 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:23:17 crc kubenswrapper[4921]: E0318 13:23:17.209939 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:23:28 crc kubenswrapper[4921]: I0318 13:23:28.209254 4921 scope.go:117] 
"RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:23:28 crc kubenswrapper[4921]: E0318 13:23:28.209917 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:23:43 crc kubenswrapper[4921]: I0318 13:23:43.209161 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:23:43 crc kubenswrapper[4921]: E0318 13:23:43.209875 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.374419 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sr7jr"] Mar 18 13:23:46 crc kubenswrapper[4921]: E0318 13:23:46.374974 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8892ed1-7ce1-4a66-99bf-37a3a030d7ba" containerName="oc" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.374985 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8892ed1-7ce1-4a66-99bf-37a3a030d7ba" containerName="oc" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.375123 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8892ed1-7ce1-4a66-99bf-37a3a030d7ba" containerName="oc" Mar 18 
13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.376047 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.392815 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr7jr"] Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.438238 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-catalog-content\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.438523 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5kx\" (UniqueName: \"kubernetes.io/projected/bec36e79-8271-4c11-87d8-7c3f13f023cc-kube-api-access-kg5kx\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.438622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-utilities\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.539494 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-catalog-content\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " 
pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.539881 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5kx\" (UniqueName: \"kubernetes.io/projected/bec36e79-8271-4c11-87d8-7c3f13f023cc-kube-api-access-kg5kx\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.540002 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-utilities\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.540192 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-catalog-content\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.540466 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-utilities\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " pod="openshift-marketplace/community-operators-sr7jr" Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.571220 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5kx\" (UniqueName: \"kubernetes.io/projected/bec36e79-8271-4c11-87d8-7c3f13f023cc-kube-api-access-kg5kx\") pod \"community-operators-sr7jr\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") " 
pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.574217 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmf5d"]
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.575985 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.586400 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmf5d"]
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.641218 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-utilities\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.641274 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-catalog-content\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.641362 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7fp8\" (UniqueName: \"kubernetes.io/projected/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-kube-api-access-q7fp8\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.702010 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.742516 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7fp8\" (UniqueName: \"kubernetes.io/projected/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-kube-api-access-q7fp8\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.742614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-utilities\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.742650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-catalog-content\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.743182 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-utilities\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.743190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-catalog-content\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.779345 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7fp8\" (UniqueName: \"kubernetes.io/projected/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-kube-api-access-q7fp8\") pod \"certified-operators-mmf5d\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") " pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:46 crc kubenswrapper[4921]: I0318 13:23:46.907960 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.135509 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sr7jr"]
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.503326 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmf5d"]
Mar 18 13:23:47 crc kubenswrapper[4921]: W0318 13:23:47.513468 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc042bdeb_2daf_48a5_b20e_1b58d7b6ac96.slice/crio-d2a33f47b56f05f8770ca68f6c2d8384d777892e706312b50b80ea6c2889fdd9 WatchSource:0}: Error finding container d2a33f47b56f05f8770ca68f6c2d8384d777892e706312b50b80ea6c2889fdd9: Status 404 returned error can't find the container with id d2a33f47b56f05f8770ca68f6c2d8384d777892e706312b50b80ea6c2889fdd9
Mar 18 13:23:47 crc kubenswrapper[4921]: E0318 13:23:47.809372 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc042bdeb_2daf_48a5_b20e_1b58d7b6ac96.slice/crio-2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc042bdeb_2daf_48a5_b20e_1b58d7b6ac96.slice/crio-conmon-2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4.scope\": RecentStats: unable to find data in memory cache]"
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.931741 4921 generic.go:334] "Generic (PLEG): container finished" podID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerID="8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd" exitCode=0
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.931817 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr7jr" event={"ID":"bec36e79-8271-4c11-87d8-7c3f13f023cc","Type":"ContainerDied","Data":"8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd"}
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.931845 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr7jr" event={"ID":"bec36e79-8271-4c11-87d8-7c3f13f023cc","Type":"ContainerStarted","Data":"b7a15bd896e3785f36a5dc683a807fa5424374b80c4d2b44c9b900027e370320"}
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.933370 4921 generic.go:334] "Generic (PLEG): container finished" podID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerID="2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4" exitCode=0
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.933397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmf5d" event={"ID":"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96","Type":"ContainerDied","Data":"2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4"}
Mar 18 13:23:47 crc kubenswrapper[4921]: I0318 13:23:47.933412 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmf5d" event={"ID":"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96","Type":"ContainerStarted","Data":"d2a33f47b56f05f8770ca68f6c2d8384d777892e706312b50b80ea6c2889fdd9"}
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.769309 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n92qz"]
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.774556 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.803057 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n92qz"]
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.878575 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk969\" (UniqueName: \"kubernetes.io/projected/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-kube-api-access-dk969\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.878810 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-catalog-content\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.878960 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-utilities\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.980519 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-catalog-content\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.980601 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-utilities\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.980630 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk969\" (UniqueName: \"kubernetes.io/projected/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-kube-api-access-dk969\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.981199 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-utilities\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:48 crc kubenswrapper[4921]: I0318 13:23:48.981208 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-catalog-content\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:49 crc kubenswrapper[4921]: I0318 13:23:49.006142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk969\" (UniqueName: \"kubernetes.io/projected/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-kube-api-access-dk969\") pod \"redhat-marketplace-n92qz\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:49 crc kubenswrapper[4921]: I0318 13:23:49.098227 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:49 crc kubenswrapper[4921]: I0318 13:23:49.526324 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n92qz"]
Mar 18 13:23:49 crc kubenswrapper[4921]: I0318 13:23:49.955465 4921 generic.go:334] "Generic (PLEG): container finished" podID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerID="df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4" exitCode=0
Mar 18 13:23:49 crc kubenswrapper[4921]: I0318 13:23:49.955553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerDied","Data":"df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4"}
Mar 18 13:23:49 crc kubenswrapper[4921]: I0318 13:23:49.955720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerStarted","Data":"56d0ef6fd7844c72dbf2f47a0a09fa7e7d16330797ca8e4b8474b53232d64239"}
Mar 18 13:23:50 crc kubenswrapper[4921]: I0318 13:23:50.975094 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerStarted","Data":"30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194"}
Mar 18 13:23:50 crc kubenswrapper[4921]: I0318 13:23:50.977521 4921 generic.go:334] "Generic (PLEG): container finished" podID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerID="1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef" exitCode=0
Mar 18 13:23:50 crc kubenswrapper[4921]: I0318 13:23:50.977591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr7jr" event={"ID":"bec36e79-8271-4c11-87d8-7c3f13f023cc","Type":"ContainerDied","Data":"1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef"}
Mar 18 13:23:50 crc kubenswrapper[4921]: I0318 13:23:50.981166 4921 generic.go:334] "Generic (PLEG): container finished" podID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerID="dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2" exitCode=0
Mar 18 13:23:50 crc kubenswrapper[4921]: I0318 13:23:50.981239 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmf5d" event={"ID":"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96","Type":"ContainerDied","Data":"dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2"}
Mar 18 13:23:51 crc kubenswrapper[4921]: I0318 13:23:51.994296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr7jr" event={"ID":"bec36e79-8271-4c11-87d8-7c3f13f023cc","Type":"ContainerStarted","Data":"3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb"}
Mar 18 13:23:51 crc kubenswrapper[4921]: I0318 13:23:51.998475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmf5d" event={"ID":"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96","Type":"ContainerStarted","Data":"3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990"}
Mar 18 13:23:52 crc kubenswrapper[4921]: I0318 13:23:52.001821 4921 generic.go:334] "Generic (PLEG): container finished" podID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerID="30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194" exitCode=0
Mar 18 13:23:52 crc kubenswrapper[4921]: I0318 13:23:52.001865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerDied","Data":"30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194"}
Mar 18 13:23:52 crc kubenswrapper[4921]: I0318 13:23:52.038073 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sr7jr" podStartSLOduration=2.545382784 podStartE2EDuration="6.038047671s" podCreationTimestamp="2026-03-18 13:23:46 +0000 UTC" firstStartedPulling="2026-03-18 13:23:47.933208866 +0000 UTC m=+4447.483129505" lastFinishedPulling="2026-03-18 13:23:51.425873743 +0000 UTC m=+4450.975794392" observedRunningTime="2026-03-18 13:23:52.03171306 +0000 UTC m=+4451.581633699" watchObservedRunningTime="2026-03-18 13:23:52.038047671 +0000 UTC m=+4451.587968310"
Mar 18 13:23:52 crc kubenswrapper[4921]: I0318 13:23:52.086532 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmf5d" podStartSLOduration=2.609076236 podStartE2EDuration="6.086506148s" podCreationTimestamp="2026-03-18 13:23:46 +0000 UTC" firstStartedPulling="2026-03-18 13:23:47.934494182 +0000 UTC m=+4447.484414821" lastFinishedPulling="2026-03-18 13:23:51.411924094 +0000 UTC m=+4450.961844733" observedRunningTime="2026-03-18 13:23:52.080695822 +0000 UTC m=+4451.630616461" watchObservedRunningTime="2026-03-18 13:23:52.086506148 +0000 UTC m=+4451.636426787"
Mar 18 13:23:53 crc kubenswrapper[4921]: I0318 13:23:53.012543 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerStarted","Data":"ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17"}
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.209571 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d"
Mar 18 13:23:56 crc kubenswrapper[4921]: E0318 13:23:56.210063 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.702791 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.702863 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.744967 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.767675 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n92qz" podStartSLOduration=6.157288676 podStartE2EDuration="8.767653514s" podCreationTimestamp="2026-03-18 13:23:48 +0000 UTC" firstStartedPulling="2026-03-18 13:23:49.959064118 +0000 UTC m=+4449.508984757" lastFinishedPulling="2026-03-18 13:23:52.569428956 +0000 UTC m=+4452.119349595" observedRunningTime="2026-03-18 13:23:53.041198507 +0000 UTC m=+4452.591119156" watchObservedRunningTime="2026-03-18 13:23:56.767653514 +0000 UTC m=+4456.317574163"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.908752 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.909121 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:56 crc kubenswrapper[4921]: I0318 13:23:56.949294 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:57 crc kubenswrapper[4921]: I0318 13:23:57.077142 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:57 crc kubenswrapper[4921]: I0318 13:23:57.079520 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:23:58 crc kubenswrapper[4921]: I0318 13:23:58.760409 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr7jr"]
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.051677 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sr7jr" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="registry-server" containerID="cri-o://3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb" gracePeriod=2
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.099728 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.099776 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.142138 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.366537 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmf5d"]
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.442792 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.528688 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-catalog-content\") pod \"bec36e79-8271-4c11-87d8-7c3f13f023cc\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") "
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.528770 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-utilities\") pod \"bec36e79-8271-4c11-87d8-7c3f13f023cc\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") "
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.528846 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg5kx\" (UniqueName: \"kubernetes.io/projected/bec36e79-8271-4c11-87d8-7c3f13f023cc-kube-api-access-kg5kx\") pod \"bec36e79-8271-4c11-87d8-7c3f13f023cc\" (UID: \"bec36e79-8271-4c11-87d8-7c3f13f023cc\") "
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.530661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-utilities" (OuterVolumeSpecName: "utilities") pod "bec36e79-8271-4c11-87d8-7c3f13f023cc" (UID: "bec36e79-8271-4c11-87d8-7c3f13f023cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.535392 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec36e79-8271-4c11-87d8-7c3f13f023cc-kube-api-access-kg5kx" (OuterVolumeSpecName: "kube-api-access-kg5kx") pod "bec36e79-8271-4c11-87d8-7c3f13f023cc" (UID: "bec36e79-8271-4c11-87d8-7c3f13f023cc"). InnerVolumeSpecName "kube-api-access-kg5kx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.588420 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bec36e79-8271-4c11-87d8-7c3f13f023cc" (UID: "bec36e79-8271-4c11-87d8-7c3f13f023cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.630506 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.630577 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec36e79-8271-4c11-87d8-7c3f13f023cc-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:23:59 crc kubenswrapper[4921]: I0318 13:23:59.630588 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg5kx\" (UniqueName: \"kubernetes.io/projected/bec36e79-8271-4c11-87d8-7c3f13f023cc-kube-api-access-kg5kx\") on node \"crc\" DevicePath \"\""
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.060762 4921 generic.go:334] "Generic (PLEG): container finished" podID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerID="3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb" exitCode=0
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.060808 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sr7jr"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.060851 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr7jr" event={"ID":"bec36e79-8271-4c11-87d8-7c3f13f023cc","Type":"ContainerDied","Data":"3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb"}
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.060888 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sr7jr" event={"ID":"bec36e79-8271-4c11-87d8-7c3f13f023cc","Type":"ContainerDied","Data":"b7a15bd896e3785f36a5dc683a807fa5424374b80c4d2b44c9b900027e370320"}
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.060904 4921 scope.go:117] "RemoveContainer" containerID="3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.061517 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mmf5d" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="registry-server" containerID="cri-o://3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990" gracePeriod=2
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.084781 4921 scope.go:117] "RemoveContainer" containerID="1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.102690 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sr7jr"]
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.111654 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sr7jr"]
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.131332 4921 scope.go:117] "RemoveContainer" containerID="8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.134724 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n92qz"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.156263 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564004-st5g2"]
Mar 18 13:24:00 crc kubenswrapper[4921]: E0318 13:24:00.156699 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="extract-utilities"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.156719 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="extract-utilities"
Mar 18 13:24:00 crc kubenswrapper[4921]: E0318 13:24:00.156743 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="registry-server"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.156752 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="registry-server"
Mar 18 13:24:00 crc kubenswrapper[4921]: E0318 13:24:00.156769 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="extract-content"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.156778 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="extract-content"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.156977 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" containerName="registry-server"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.157535 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-st5g2"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.164898 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.165253 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.165375 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.187827 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-st5g2"]
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.230043 4921 scope.go:117] "RemoveContainer" containerID="3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb"
Mar 18 13:24:00 crc kubenswrapper[4921]: E0318 13:24:00.230845 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb\": container with ID starting with 3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb not found: ID does not exist" containerID="3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.230900 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb"} err="failed to get container status \"3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb\": rpc error: code = NotFound desc = could not find container \"3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb\": container with ID starting with 3fd410cdb57cd7229cf2cb8d0f14b42ef3db94507d2109d47de49e644fea1ceb not found: ID does not exist"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.230932 4921 scope.go:117] "RemoveContainer" containerID="1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef"
Mar 18 13:24:00 crc kubenswrapper[4921]: E0318 13:24:00.231439 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef\": container with ID starting with 1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef not found: ID does not exist" containerID="1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.231466 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef"} err="failed to get container status \"1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef\": rpc error: code = NotFound desc = could not find container \"1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef\": container with ID starting with 1b27172c95778a43b8865b1d369041535dfd7e3463503a17211938e3fb04b7ef not found: ID does not exist"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.231480 4921 scope.go:117] "RemoveContainer" containerID="8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd"
Mar 18 13:24:00 crc kubenswrapper[4921]: E0318 13:24:00.231752 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd\": container with ID starting with 8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd not found: ID does not exist" containerID="8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.231780 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd"} err="failed to get container status \"8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd\": rpc error: code = NotFound desc = could not find container \"8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd\": container with ID starting with 8223bd7084bf0fa4d9e8ac1f5acb90ff18cb6a8f94d83bb7de2d4fbad1b49cdd not found: ID does not exist"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.248380 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwhvc\" (UniqueName: \"kubernetes.io/projected/3ff7a1cd-28fa-4375-959e-3306256aa950-kube-api-access-bwhvc\") pod \"auto-csr-approver-29564004-st5g2\" (UID: \"3ff7a1cd-28fa-4375-959e-3306256aa950\") " pod="openshift-infra/auto-csr-approver-29564004-st5g2"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.350295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwhvc\" (UniqueName: \"kubernetes.io/projected/3ff7a1cd-28fa-4375-959e-3306256aa950-kube-api-access-bwhvc\") pod \"auto-csr-approver-29564004-st5g2\" (UID: \"3ff7a1cd-28fa-4375-959e-3306256aa950\") " pod="openshift-infra/auto-csr-approver-29564004-st5g2"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.370928 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwhvc\" (UniqueName: \"kubernetes.io/projected/3ff7a1cd-28fa-4375-959e-3306256aa950-kube-api-access-bwhvc\") pod \"auto-csr-approver-29564004-st5g2\" (UID: \"3ff7a1cd-28fa-4375-959e-3306256aa950\") " pod="openshift-infra/auto-csr-approver-29564004-st5g2"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.486342 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmf5d"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.552563 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-utilities\") pod \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") "
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.553070 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-catalog-content\") pod \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") "
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.553180 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7fp8\" (UniqueName: \"kubernetes.io/projected/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-kube-api-access-q7fp8\") pod \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\" (UID: \"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96\") "
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.553669 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-utilities" (OuterVolumeSpecName: "utilities") pod "c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" (UID: "c042bdeb-2daf-48a5-b20e-1b58d7b6ac96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.556263 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-st5g2"
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.556438 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-kube-api-access-q7fp8" (OuterVolumeSpecName: "kube-api-access-q7fp8") pod "c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" (UID: "c042bdeb-2daf-48a5-b20e-1b58d7b6ac96"). InnerVolumeSpecName "kube-api-access-q7fp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.654980 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.655015 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7fp8\" (UniqueName: \"kubernetes.io/projected/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-kube-api-access-q7fp8\") on node \"crc\" DevicePath \"\""
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.773301 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-st5g2"]
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.810005 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" (UID: "c042bdeb-2daf-48a5-b20e-1b58d7b6ac96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:24:00 crc kubenswrapper[4921]: I0318 13:24:00.857926 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.071988 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-st5g2" event={"ID":"3ff7a1cd-28fa-4375-959e-3306256aa950","Type":"ContainerStarted","Data":"7e94bd3570ec62514209e09ddda66c054c5259df84d101e2a6af497e43c600f2"}
Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.074475 4921 generic.go:334] "Generic (PLEG): container finished" podID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerID="3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990" exitCode=0
Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.074617 4921 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-mmf5d" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.075612 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmf5d" event={"ID":"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96","Type":"ContainerDied","Data":"3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990"} Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.075749 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmf5d" event={"ID":"c042bdeb-2daf-48a5-b20e-1b58d7b6ac96","Type":"ContainerDied","Data":"d2a33f47b56f05f8770ca68f6c2d8384d777892e706312b50b80ea6c2889fdd9"} Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.075848 4921 scope.go:117] "RemoveContainer" containerID="3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.096918 4921 scope.go:117] "RemoveContainer" containerID="dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.117423 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmf5d"] Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.119916 4921 scope.go:117] "RemoveContainer" containerID="2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.126058 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mmf5d"] Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.156220 4921 scope.go:117] "RemoveContainer" containerID="3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990" Mar 18 13:24:01 crc kubenswrapper[4921]: E0318 13:24:01.156902 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990\": container with ID starting with 3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990 not found: ID does not exist" containerID="3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.156962 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990"} err="failed to get container status \"3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990\": rpc error: code = NotFound desc = could not find container \"3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990\": container with ID starting with 3384cbe46c7cbf6c60bbd67abcaf6e7bf5c41b61e8e9fdd5ca42966a8a7aa990 not found: ID does not exist" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.156996 4921 scope.go:117] "RemoveContainer" containerID="dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2" Mar 18 13:24:01 crc kubenswrapper[4921]: E0318 13:24:01.157454 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2\": container with ID starting with dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2 not found: ID does not exist" containerID="dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.157500 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2"} err="failed to get container status \"dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2\": rpc error: code = NotFound desc = could not find container \"dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2\": container with ID 
starting with dd37357f0cda9037423a47431f511294ee6029d48e46aaa0abb7f5a625432dc2 not found: ID does not exist" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.157527 4921 scope.go:117] "RemoveContainer" containerID="2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4" Mar 18 13:24:01 crc kubenswrapper[4921]: E0318 13:24:01.157884 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4\": container with ID starting with 2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4 not found: ID does not exist" containerID="2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.157918 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4"} err="failed to get container status \"2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4\": rpc error: code = NotFound desc = could not find container \"2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4\": container with ID starting with 2189ac4226d59762a669632f6d6a4f202e7c2f1506af8e444fea3e7bc8c5d6e4 not found: ID does not exist" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.220656 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec36e79-8271-4c11-87d8-7c3f13f023cc" path="/var/lib/kubelet/pods/bec36e79-8271-4c11-87d8-7c3f13f023cc/volumes" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.224025 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" path="/var/lib/kubelet/pods/c042bdeb-2daf-48a5-b20e-1b58d7b6ac96/volumes" Mar 18 13:24:01 crc kubenswrapper[4921]: I0318 13:24:01.765612 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-n92qz"] Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.084374 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n92qz" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="registry-server" containerID="cri-o://ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17" gracePeriod=2 Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.484307 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n92qz" Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.595574 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-utilities\") pod \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.595650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk969\" (UniqueName: \"kubernetes.io/projected/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-kube-api-access-dk969\") pod \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.595776 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-catalog-content\") pod \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\" (UID: \"3d464c7f-ade9-4b51-b2f7-764e6ed92f68\") " Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.597600 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-utilities" (OuterVolumeSpecName: "utilities") pod "3d464c7f-ade9-4b51-b2f7-764e6ed92f68" (UID: 
"3d464c7f-ade9-4b51-b2f7-764e6ed92f68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.625644 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d464c7f-ade9-4b51-b2f7-764e6ed92f68" (UID: "3d464c7f-ade9-4b51-b2f7-764e6ed92f68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.697082 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.697147 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.817931 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-kube-api-access-dk969" (OuterVolumeSpecName: "kube-api-access-dk969") pod "3d464c7f-ade9-4b51-b2f7-764e6ed92f68" (UID: "3d464c7f-ade9-4b51-b2f7-764e6ed92f68"). InnerVolumeSpecName "kube-api-access-dk969". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:02 crc kubenswrapper[4921]: I0318 13:24:02.899939 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk969\" (UniqueName: \"kubernetes.io/projected/3d464c7f-ade9-4b51-b2f7-764e6ed92f68-kube-api-access-dk969\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.093621 4921 generic.go:334] "Generic (PLEG): container finished" podID="3ff7a1cd-28fa-4375-959e-3306256aa950" containerID="1c0c6193aab0386358ee7e55c2dd157892b4ee764993642fbd43d3ea83d8156d" exitCode=0 Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.093730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-st5g2" event={"ID":"3ff7a1cd-28fa-4375-959e-3306256aa950","Type":"ContainerDied","Data":"1c0c6193aab0386358ee7e55c2dd157892b4ee764993642fbd43d3ea83d8156d"} Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.096864 4921 generic.go:334] "Generic (PLEG): container finished" podID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerID="ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17" exitCode=0 Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.096916 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n92qz" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.096928 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerDied","Data":"ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17"} Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.096996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n92qz" event={"ID":"3d464c7f-ade9-4b51-b2f7-764e6ed92f68","Type":"ContainerDied","Data":"56d0ef6fd7844c72dbf2f47a0a09fa7e7d16330797ca8e4b8474b53232d64239"} Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.097024 4921 scope.go:117] "RemoveContainer" containerID="ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.129218 4921 scope.go:117] "RemoveContainer" containerID="30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.137408 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n92qz"] Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.143607 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n92qz"] Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.165609 4921 scope.go:117] "RemoveContainer" containerID="df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.190482 4921 scope.go:117] "RemoveContainer" containerID="ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17" Mar 18 13:24:03 crc kubenswrapper[4921]: E0318 13:24:03.191012 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17\": container with ID starting with ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17 not found: ID does not exist" containerID="ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.191069 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17"} err="failed to get container status \"ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17\": rpc error: code = NotFound desc = could not find container \"ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17\": container with ID starting with ef1f8a844d7e9ea7db2668b4fbbf0f356d2d0c420acc596a2b8d1e892fe60c17 not found: ID does not exist" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.191102 4921 scope.go:117] "RemoveContainer" containerID="30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194" Mar 18 13:24:03 crc kubenswrapper[4921]: E0318 13:24:03.191620 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194\": container with ID starting with 30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194 not found: ID does not exist" containerID="30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.191657 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194"} err="failed to get container status \"30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194\": rpc error: code = NotFound desc = could not find container \"30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194\": container with ID 
starting with 30f32c34884758b0f7e484a2bdcec9ebb246ab4bbb56f70add22b1196ceb3194 not found: ID does not exist" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.191683 4921 scope.go:117] "RemoveContainer" containerID="df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4" Mar 18 13:24:03 crc kubenswrapper[4921]: E0318 13:24:03.192085 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4\": container with ID starting with df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4 not found: ID does not exist" containerID="df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.192137 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4"} err="failed to get container status \"df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4\": rpc error: code = NotFound desc = could not find container \"df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4\": container with ID starting with df6d62e5f5c41c52627679cb0adb103d813d5028b8d2f99daad9788f0f1368a4 not found: ID does not exist" Mar 18 13:24:03 crc kubenswrapper[4921]: I0318 13:24:03.220322 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" path="/var/lib/kubelet/pods/3d464c7f-ade9-4b51-b2f7-764e6ed92f68/volumes" Mar 18 13:24:04 crc kubenswrapper[4921]: I0318 13:24:04.377999 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-st5g2" Mar 18 13:24:04 crc kubenswrapper[4921]: I0318 13:24:04.421007 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwhvc\" (UniqueName: \"kubernetes.io/projected/3ff7a1cd-28fa-4375-959e-3306256aa950-kube-api-access-bwhvc\") pod \"3ff7a1cd-28fa-4375-959e-3306256aa950\" (UID: \"3ff7a1cd-28fa-4375-959e-3306256aa950\") " Mar 18 13:24:04 crc kubenswrapper[4921]: I0318 13:24:04.427007 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff7a1cd-28fa-4375-959e-3306256aa950-kube-api-access-bwhvc" (OuterVolumeSpecName: "kube-api-access-bwhvc") pod "3ff7a1cd-28fa-4375-959e-3306256aa950" (UID: "3ff7a1cd-28fa-4375-959e-3306256aa950"). InnerVolumeSpecName "kube-api-access-bwhvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:04 crc kubenswrapper[4921]: I0318 13:24:04.523677 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwhvc\" (UniqueName: \"kubernetes.io/projected/3ff7a1cd-28fa-4375-959e-3306256aa950-kube-api-access-bwhvc\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:05 crc kubenswrapper[4921]: I0318 13:24:05.116663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-st5g2" event={"ID":"3ff7a1cd-28fa-4375-959e-3306256aa950","Type":"ContainerDied","Data":"7e94bd3570ec62514209e09ddda66c054c5259df84d101e2a6af497e43c600f2"} Mar 18 13:24:05 crc kubenswrapper[4921]: I0318 13:24:05.116762 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e94bd3570ec62514209e09ddda66c054c5259df84d101e2a6af497e43c600f2" Mar 18 13:24:05 crc kubenswrapper[4921]: I0318 13:24:05.116852 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-st5g2" Mar 18 13:24:05 crc kubenswrapper[4921]: I0318 13:24:05.446230 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-npd9w"] Mar 18 13:24:05 crc kubenswrapper[4921]: I0318 13:24:05.451980 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-npd9w"] Mar 18 13:24:07 crc kubenswrapper[4921]: I0318 13:24:07.218750 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a6cde1-043f-420d-8653-f624145df1b2" path="/var/lib/kubelet/pods/57a6cde1-043f-420d-8653-f624145df1b2/volumes" Mar 18 13:24:11 crc kubenswrapper[4921]: I0318 13:24:11.214165 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:24:11 crc kubenswrapper[4921]: E0318 13:24:11.214779 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.420008 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nqvbw"] Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421011 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="extract-content" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421026 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="extract-content" Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421039 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="extract-utilities" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421048 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="extract-utilities" Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421059 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="extract-utilities" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421068 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="extract-utilities" Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421086 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="registry-server" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421094 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="registry-server" Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421125 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff7a1cd-28fa-4375-959e-3306256aa950" containerName="oc" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421134 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff7a1cd-28fa-4375-959e-3306256aa950" containerName="oc" Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421145 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="extract-content" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421152 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="extract-content" Mar 18 13:24:20 crc kubenswrapper[4921]: E0318 13:24:20.421172 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="registry-server" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421179 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="registry-server" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421321 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d464c7f-ade9-4b51-b2f7-764e6ed92f68" containerName="registry-server" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421340 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff7a1cd-28fa-4375-959e-3306256aa950" containerName="oc" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.421353 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c042bdeb-2daf-48a5-b20e-1b58d7b6ac96" containerName="registry-server" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.422330 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.440225 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqvbw"] Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.458919 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-utilities\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.459016 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-catalog-content\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.459049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4tb\" (UniqueName: \"kubernetes.io/projected/87fb390f-ea33-4fc0-9da6-aa7065d5d459-kube-api-access-fq4tb\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.560264 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-catalog-content\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.560831 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fq4tb\" (UniqueName: \"kubernetes.io/projected/87fb390f-ea33-4fc0-9da6-aa7065d5d459-kube-api-access-fq4tb\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.560772 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-catalog-content\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.561256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-utilities\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.561565 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-utilities\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.582445 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4tb\" (UniqueName: \"kubernetes.io/projected/87fb390f-ea33-4fc0-9da6-aa7065d5d459-kube-api-access-fq4tb\") pod \"redhat-operators-nqvbw\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.748332 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:20 crc kubenswrapper[4921]: I0318 13:24:20.971389 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nqvbw"] Mar 18 13:24:21 crc kubenswrapper[4921]: I0318 13:24:21.226230 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerStarted","Data":"29e01acc357145b27f4668926485d5d5f6452411bad72917a5132230ec8454f7"} Mar 18 13:24:22 crc kubenswrapper[4921]: I0318 13:24:22.235958 4921 generic.go:334] "Generic (PLEG): container finished" podID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerID="96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f" exitCode=0 Mar 18 13:24:22 crc kubenswrapper[4921]: I0318 13:24:22.236014 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerDied","Data":"96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f"} Mar 18 13:24:23 crc kubenswrapper[4921]: I0318 13:24:23.256421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerStarted","Data":"4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f"} Mar 18 13:24:24 crc kubenswrapper[4921]: I0318 13:24:24.264629 4921 generic.go:334] "Generic (PLEG): container finished" podID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerID="4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f" exitCode=0 Mar 18 13:24:24 crc kubenswrapper[4921]: I0318 13:24:24.264686 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" 
event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerDied","Data":"4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f"} Mar 18 13:24:25 crc kubenswrapper[4921]: I0318 13:24:25.284921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerStarted","Data":"be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7"} Mar 18 13:24:25 crc kubenswrapper[4921]: I0318 13:24:25.309794 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nqvbw" podStartSLOduration=2.772403305 podStartE2EDuration="5.309771386s" podCreationTimestamp="2026-03-18 13:24:20 +0000 UTC" firstStartedPulling="2026-03-18 13:24:22.237430836 +0000 UTC m=+4481.787351475" lastFinishedPulling="2026-03-18 13:24:24.774798907 +0000 UTC m=+4484.324719556" observedRunningTime="2026-03-18 13:24:25.303932269 +0000 UTC m=+4484.853852918" watchObservedRunningTime="2026-03-18 13:24:25.309771386 +0000 UTC m=+4484.859692025" Mar 18 13:24:26 crc kubenswrapper[4921]: I0318 13:24:26.209566 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:24:26 crc kubenswrapper[4921]: E0318 13:24:26.209856 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:24:30 crc kubenswrapper[4921]: I0318 13:24:30.748549 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:30 crc kubenswrapper[4921]: 
I0318 13:24:30.749186 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:30 crc kubenswrapper[4921]: I0318 13:24:30.897207 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:31 crc kubenswrapper[4921]: I0318 13:24:31.371380 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:31 crc kubenswrapper[4921]: I0318 13:24:31.421392 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqvbw"] Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.344622 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nqvbw" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="registry-server" containerID="cri-o://be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7" gracePeriod=2 Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.725085 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.752060 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-catalog-content\") pod \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.752127 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-utilities\") pod \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.752236 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq4tb\" (UniqueName: \"kubernetes.io/projected/87fb390f-ea33-4fc0-9da6-aa7065d5d459-kube-api-access-fq4tb\") pod \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\" (UID: \"87fb390f-ea33-4fc0-9da6-aa7065d5d459\") " Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.754216 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-utilities" (OuterVolumeSpecName: "utilities") pod "87fb390f-ea33-4fc0-9da6-aa7065d5d459" (UID: "87fb390f-ea33-4fc0-9da6-aa7065d5d459"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.768452 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fb390f-ea33-4fc0-9da6-aa7065d5d459-kube-api-access-fq4tb" (OuterVolumeSpecName: "kube-api-access-fq4tb") pod "87fb390f-ea33-4fc0-9da6-aa7065d5d459" (UID: "87fb390f-ea33-4fc0-9da6-aa7065d5d459"). InnerVolumeSpecName "kube-api-access-fq4tb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.854286 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:33 crc kubenswrapper[4921]: I0318 13:24:33.854324 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq4tb\" (UniqueName: \"kubernetes.io/projected/87fb390f-ea33-4fc0-9da6-aa7065d5d459-kube-api-access-fq4tb\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.353186 4921 generic.go:334] "Generic (PLEG): container finished" podID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerID="be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7" exitCode=0 Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.353226 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerDied","Data":"be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7"} Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.353273 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nqvbw" event={"ID":"87fb390f-ea33-4fc0-9da6-aa7065d5d459","Type":"ContainerDied","Data":"29e01acc357145b27f4668926485d5d5f6452411bad72917a5132230ec8454f7"} Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.353295 4921 scope.go:117] "RemoveContainer" containerID="be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.353296 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nqvbw" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.371822 4921 scope.go:117] "RemoveContainer" containerID="4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.392172 4921 scope.go:117] "RemoveContainer" containerID="96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.421048 4921 scope.go:117] "RemoveContainer" containerID="be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7" Mar 18 13:24:34 crc kubenswrapper[4921]: E0318 13:24:34.421864 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7\": container with ID starting with be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7 not found: ID does not exist" containerID="be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.421906 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7"} err="failed to get container status \"be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7\": rpc error: code = NotFound desc = could not find container \"be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7\": container with ID starting with be7da61a1a2eb004d8af8f537bbcd78e6579f3fa07d6b9c488d4ffe135a5a6b7 not found: ID does not exist" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.421937 4921 scope.go:117] "RemoveContainer" containerID="4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f" Mar 18 13:24:34 crc kubenswrapper[4921]: E0318 13:24:34.422352 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f\": container with ID starting with 4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f not found: ID does not exist" containerID="4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.422387 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f"} err="failed to get container status \"4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f\": rpc error: code = NotFound desc = could not find container \"4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f\": container with ID starting with 4790be030be555c4efe466d336ef09062e19c94dd00b440135e4ac3ab01a256f not found: ID does not exist" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.422407 4921 scope.go:117] "RemoveContainer" containerID="96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f" Mar 18 13:24:34 crc kubenswrapper[4921]: E0318 13:24:34.422708 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f\": container with ID starting with 96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f not found: ID does not exist" containerID="96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f" Mar 18 13:24:34 crc kubenswrapper[4921]: I0318 13:24:34.422745 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f"} err="failed to get container status \"96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f\": rpc error: code = NotFound desc = could not find container 
\"96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f\": container with ID starting with 96f7e288e157ef2516fe120340ffdcb7aabc927fc85c7d5d1eb729315ce5591f not found: ID does not exist" Mar 18 13:24:35 crc kubenswrapper[4921]: I0318 13:24:35.285396 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87fb390f-ea33-4fc0-9da6-aa7065d5d459" (UID: "87fb390f-ea33-4fc0-9da6-aa7065d5d459"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:24:35 crc kubenswrapper[4921]: I0318 13:24:35.380745 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87fb390f-ea33-4fc0-9da6-aa7065d5d459-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:35 crc kubenswrapper[4921]: I0318 13:24:35.594645 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nqvbw"] Mar 18 13:24:35 crc kubenswrapper[4921]: I0318 13:24:35.602902 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nqvbw"] Mar 18 13:24:37 crc kubenswrapper[4921]: I0318 13:24:37.208896 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:24:37 crc kubenswrapper[4921]: E0318 13:24:37.209244 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:24:37 crc kubenswrapper[4921]: I0318 13:24:37.217712 4921 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" path="/var/lib/kubelet/pods/87fb390f-ea33-4fc0-9da6-aa7065d5d459/volumes" Mar 18 13:24:49 crc kubenswrapper[4921]: I0318 13:24:49.211126 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:24:49 crc kubenswrapper[4921]: E0318 13:24:49.211947 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:25:02 crc kubenswrapper[4921]: I0318 13:25:02.209350 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:25:02 crc kubenswrapper[4921]: E0318 13:25:02.209960 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:25:06 crc kubenswrapper[4921]: I0318 13:25:06.845635 4921 scope.go:117] "RemoveContainer" containerID="f7df322ca43f842d0ecec0f104fce020eb2962957933461a7ed4c504826bc12e" Mar 18 13:25:17 crc kubenswrapper[4921]: I0318 13:25:17.208845 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:25:17 crc kubenswrapper[4921]: E0318 13:25:17.209776 4921 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:25:31 crc kubenswrapper[4921]: I0318 13:25:31.212874 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:25:31 crc kubenswrapper[4921]: E0318 13:25:31.214837 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:25:46 crc kubenswrapper[4921]: I0318 13:25:46.210371 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:25:46 crc kubenswrapper[4921]: E0318 13:25:46.211686 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.146690 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564006-4dwvr"] Mar 18 13:26:00 crc kubenswrapper[4921]: E0318 13:26:00.147755 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="registry-server" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.147778 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="registry-server" Mar 18 13:26:00 crc kubenswrapper[4921]: E0318 13:26:00.147819 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="extract-content" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.147831 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="extract-content" Mar 18 13:26:00 crc kubenswrapper[4921]: E0318 13:26:00.147867 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="extract-utilities" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.147879 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="extract-utilities" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.148104 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fb390f-ea33-4fc0-9da6-aa7065d5d459" containerName="registry-server" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.149380 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.152135 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.152501 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.152574 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.175701 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-4dwvr"] Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.208902 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:26:00 crc kubenswrapper[4921]: E0318 13:26:00.209211 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.342987 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6vgq\" (UniqueName: \"kubernetes.io/projected/34af6292-66b5-4f8d-8b9e-bf5d13acfa99-kube-api-access-w6vgq\") pod \"auto-csr-approver-29564006-4dwvr\" (UID: \"34af6292-66b5-4f8d-8b9e-bf5d13acfa99\") " pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.443815 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w6vgq\" (UniqueName: \"kubernetes.io/projected/34af6292-66b5-4f8d-8b9e-bf5d13acfa99-kube-api-access-w6vgq\") pod \"auto-csr-approver-29564006-4dwvr\" (UID: \"34af6292-66b5-4f8d-8b9e-bf5d13acfa99\") " pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.464626 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6vgq\" (UniqueName: \"kubernetes.io/projected/34af6292-66b5-4f8d-8b9e-bf5d13acfa99-kube-api-access-w6vgq\") pod \"auto-csr-approver-29564006-4dwvr\" (UID: \"34af6292-66b5-4f8d-8b9e-bf5d13acfa99\") " pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.475271 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.889584 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-4dwvr"] Mar 18 13:26:00 crc kubenswrapper[4921]: I0318 13:26:00.964349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" event={"ID":"34af6292-66b5-4f8d-8b9e-bf5d13acfa99","Type":"ContainerStarted","Data":"ef1151a35c25d67dfe52c9f4bc75b336d17ba0c65ad255821d2a7fa2b03938dd"} Mar 18 13:26:02 crc kubenswrapper[4921]: I0318 13:26:02.982334 4921 generic.go:334] "Generic (PLEG): container finished" podID="34af6292-66b5-4f8d-8b9e-bf5d13acfa99" containerID="1ede5f2894b83a5bb9aab68dadb4c0cb0df72ffec8a90be8ef60958bf20b45f3" exitCode=0 Mar 18 13:26:02 crc kubenswrapper[4921]: I0318 13:26:02.982463 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" 
event={"ID":"34af6292-66b5-4f8d-8b9e-bf5d13acfa99","Type":"ContainerDied","Data":"1ede5f2894b83a5bb9aab68dadb4c0cb0df72ffec8a90be8ef60958bf20b45f3"} Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.306854 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.501905 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6vgq\" (UniqueName: \"kubernetes.io/projected/34af6292-66b5-4f8d-8b9e-bf5d13acfa99-kube-api-access-w6vgq\") pod \"34af6292-66b5-4f8d-8b9e-bf5d13acfa99\" (UID: \"34af6292-66b5-4f8d-8b9e-bf5d13acfa99\") " Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.508526 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34af6292-66b5-4f8d-8b9e-bf5d13acfa99-kube-api-access-w6vgq" (OuterVolumeSpecName: "kube-api-access-w6vgq") pod "34af6292-66b5-4f8d-8b9e-bf5d13acfa99" (UID: "34af6292-66b5-4f8d-8b9e-bf5d13acfa99"). InnerVolumeSpecName "kube-api-access-w6vgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.604231 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6vgq\" (UniqueName: \"kubernetes.io/projected/34af6292-66b5-4f8d-8b9e-bf5d13acfa99-kube-api-access-w6vgq\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.998400 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" event={"ID":"34af6292-66b5-4f8d-8b9e-bf5d13acfa99","Type":"ContainerDied","Data":"ef1151a35c25d67dfe52c9f4bc75b336d17ba0c65ad255821d2a7fa2b03938dd"} Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.998465 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1151a35c25d67dfe52c9f4bc75b336d17ba0c65ad255821d2a7fa2b03938dd" Mar 18 13:26:04 crc kubenswrapper[4921]: I0318 13:26:04.999052 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-4dwvr" Mar 18 13:26:05 crc kubenswrapper[4921]: I0318 13:26:05.383896 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-6zz8n"] Mar 18 13:26:05 crc kubenswrapper[4921]: I0318 13:26:05.389083 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-6zz8n"] Mar 18 13:26:07 crc kubenswrapper[4921]: I0318 13:26:07.221009 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a" path="/var/lib/kubelet/pods/9c30f5b5-5ad9-4ad4-8bf6-dce5930c901a/volumes" Mar 18 13:26:13 crc kubenswrapper[4921]: I0318 13:26:13.209531 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:26:13 crc kubenswrapper[4921]: E0318 13:26:13.210360 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:26:25 crc kubenswrapper[4921]: I0318 13:26:25.219845 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d" Mar 18 13:26:26 crc kubenswrapper[4921]: I0318 13:26:26.153099 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"6801f03ae22e22ea24d6b66923ad809a09a2350a73e361f997823c799fdb1b50"} Mar 18 13:27:06 crc kubenswrapper[4921]: I0318 13:27:06.957779 4921 scope.go:117] "RemoveContainer" containerID="b762e6bb38a3b639d806ea7c63e429447112243ec855a9630e3eecd7ddf697a2" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.610068 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-d5qjc"] Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.616208 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-d5qjc"] Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.757745 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6f449"] Mar 18 13:27:07 crc kubenswrapper[4921]: E0318 13:27:07.758683 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34af6292-66b5-4f8d-8b9e-bf5d13acfa99" containerName="oc" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.758706 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="34af6292-66b5-4f8d-8b9e-bf5d13acfa99" containerName="oc" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.758878 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="34af6292-66b5-4f8d-8b9e-bf5d13acfa99" containerName="oc" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.759382 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.761407 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.762206 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.762869 4921 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-rlgxp" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.766908 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.780139 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6f449"] Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.830993 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e22d5cd-769b-4ade-a159-93455404f3d4-node-mnt\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.831057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e22d5cd-769b-4ade-a159-93455404f3d4-crc-storage\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.831097 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdtn\" (UniqueName: \"kubernetes.io/projected/5e22d5cd-769b-4ade-a159-93455404f3d4-kube-api-access-zzdtn\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.932536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e22d5cd-769b-4ade-a159-93455404f3d4-node-mnt\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.932943 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e22d5cd-769b-4ade-a159-93455404f3d4-crc-storage\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.932874 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e22d5cd-769b-4ade-a159-93455404f3d4-node-mnt\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.933028 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdtn\" (UniqueName: \"kubernetes.io/projected/5e22d5cd-769b-4ade-a159-93455404f3d4-kube-api-access-zzdtn\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449" Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.934436 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/5e22d5cd-769b-4ade-a159-93455404f3d4-crc-storage\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449"
Mar 18 13:27:07 crc kubenswrapper[4921]: I0318 13:27:07.958159 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdtn\" (UniqueName: \"kubernetes.io/projected/5e22d5cd-769b-4ade-a159-93455404f3d4-kube-api-access-zzdtn\") pod \"crc-storage-crc-6f449\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") " pod="crc-storage/crc-storage-crc-6f449"
Mar 18 13:27:08 crc kubenswrapper[4921]: I0318 13:27:08.076104 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6f449"
Mar 18 13:27:08 crc kubenswrapper[4921]: I0318 13:27:08.575309 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6f449"]
Mar 18 13:27:08 crc kubenswrapper[4921]: I0318 13:27:08.576677 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:27:09 crc kubenswrapper[4921]: I0318 13:27:09.220183 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d0865c-c0b1-486c-8e28-92a275c035b6" path="/var/lib/kubelet/pods/a7d0865c-c0b1-486c-8e28-92a275c035b6/volumes"
Mar 18 13:27:09 crc kubenswrapper[4921]: I0318 13:27:09.465015 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6f449" event={"ID":"5e22d5cd-769b-4ade-a159-93455404f3d4","Type":"ContainerStarted","Data":"45afe41cd8b776b43df0a877fbaea22f8218a3eed66c3d79fb6c39b63d14a08f"}
Mar 18 13:27:10 crc kubenswrapper[4921]: I0318 13:27:10.474200 4921 generic.go:334] "Generic (PLEG): container finished" podID="5e22d5cd-769b-4ade-a159-93455404f3d4" containerID="7f6606f7b853c58e72012a0fbe3a6466b5e4e2f95e2cd52ae1d264e058a92f55" exitCode=0
Mar 18 13:27:10 crc kubenswrapper[4921]: I0318 13:27:10.474278 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6f449" event={"ID":"5e22d5cd-769b-4ade-a159-93455404f3d4","Type":"ContainerDied","Data":"7f6606f7b853c58e72012a0fbe3a6466b5e4e2f95e2cd52ae1d264e058a92f55"}
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.025169 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6f449"
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.102822 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e22d5cd-769b-4ade-a159-93455404f3d4-crc-storage\") pod \"5e22d5cd-769b-4ade-a159-93455404f3d4\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") "
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.102903 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e22d5cd-769b-4ade-a159-93455404f3d4-node-mnt\") pod \"5e22d5cd-769b-4ade-a159-93455404f3d4\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") "
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.102988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzdtn\" (UniqueName: \"kubernetes.io/projected/5e22d5cd-769b-4ade-a159-93455404f3d4-kube-api-access-zzdtn\") pod \"5e22d5cd-769b-4ade-a159-93455404f3d4\" (UID: \"5e22d5cd-769b-4ade-a159-93455404f3d4\") "
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.103067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e22d5cd-769b-4ade-a159-93455404f3d4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5e22d5cd-769b-4ade-a159-93455404f3d4" (UID: "5e22d5cd-769b-4ade-a159-93455404f3d4"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.103216 4921 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e22d5cd-769b-4ade-a159-93455404f3d4-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.110966 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e22d5cd-769b-4ade-a159-93455404f3d4-kube-api-access-zzdtn" (OuterVolumeSpecName: "kube-api-access-zzdtn") pod "5e22d5cd-769b-4ade-a159-93455404f3d4" (UID: "5e22d5cd-769b-4ade-a159-93455404f3d4"). InnerVolumeSpecName "kube-api-access-zzdtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.132496 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e22d5cd-769b-4ade-a159-93455404f3d4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5e22d5cd-769b-4ade-a159-93455404f3d4" (UID: "5e22d5cd-769b-4ade-a159-93455404f3d4"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.204322 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzdtn\" (UniqueName: \"kubernetes.io/projected/5e22d5cd-769b-4ade-a159-93455404f3d4-kube-api-access-zzdtn\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.204689 4921 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e22d5cd-769b-4ade-a159-93455404f3d4-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.492200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6f449" event={"ID":"5e22d5cd-769b-4ade-a159-93455404f3d4","Type":"ContainerDied","Data":"45afe41cd8b776b43df0a877fbaea22f8218a3eed66c3d79fb6c39b63d14a08f"}
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.492277 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45afe41cd8b776b43df0a877fbaea22f8218a3eed66c3d79fb6c39b63d14a08f"
Mar 18 13:27:12 crc kubenswrapper[4921]: I0318 13:27:12.492293 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6f449"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.097133 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-6f449"]
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.103737 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-6f449"]
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.209357 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p8sxl"]
Mar 18 13:27:14 crc kubenswrapper[4921]: E0318 13:27:14.209667 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e22d5cd-769b-4ade-a159-93455404f3d4" containerName="storage"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.209681 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e22d5cd-769b-4ade-a159-93455404f3d4" containerName="storage"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.209827 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e22d5cd-769b-4ade-a159-93455404f3d4" containerName="storage"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.210328 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.212370 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.212919 4921 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-rlgxp"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.213595 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.218012 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.218514 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p8sxl"]
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.239360 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1c25b5c7-d927-4495-8f80-e2207730825a-node-mnt\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.239456 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1c25b5c7-d927-4495-8f80-e2207730825a-crc-storage\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.239513 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnsmj\" (UniqueName: \"kubernetes.io/projected/1c25b5c7-d927-4495-8f80-e2207730825a-kube-api-access-lnsmj\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.348467 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1c25b5c7-d927-4495-8f80-e2207730825a-node-mnt\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.349066 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1c25b5c7-d927-4495-8f80-e2207730825a-node-mnt\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.350511 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1c25b5c7-d927-4495-8f80-e2207730825a-crc-storage\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.349094 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1c25b5c7-d927-4495-8f80-e2207730825a-crc-storage\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.360369 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnsmj\" (UniqueName: \"kubernetes.io/projected/1c25b5c7-d927-4495-8f80-e2207730825a-kube-api-access-lnsmj\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.394091 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnsmj\" (UniqueName: \"kubernetes.io/projected/1c25b5c7-d927-4495-8f80-e2207730825a-kube-api-access-lnsmj\") pod \"crc-storage-crc-p8sxl\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") " pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.530637 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:14 crc kubenswrapper[4921]: I0318 13:27:14.939850 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p8sxl"]
Mar 18 13:27:14 crc kubenswrapper[4921]: W0318 13:27:14.946305 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c25b5c7_d927_4495_8f80_e2207730825a.slice/crio-314b3afeea7ba41633ac21e167a61c2501be4a31773aac6d1cdb3e5ad8e40416 WatchSource:0}: Error finding container 314b3afeea7ba41633ac21e167a61c2501be4a31773aac6d1cdb3e5ad8e40416: Status 404 returned error can't find the container with id 314b3afeea7ba41633ac21e167a61c2501be4a31773aac6d1cdb3e5ad8e40416
Mar 18 13:27:15 crc kubenswrapper[4921]: I0318 13:27:15.220602 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e22d5cd-769b-4ade-a159-93455404f3d4" path="/var/lib/kubelet/pods/5e22d5cd-769b-4ade-a159-93455404f3d4/volumes"
Mar 18 13:27:15 crc kubenswrapper[4921]: I0318 13:27:15.512527 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p8sxl" event={"ID":"1c25b5c7-d927-4495-8f80-e2207730825a","Type":"ContainerStarted","Data":"314b3afeea7ba41633ac21e167a61c2501be4a31773aac6d1cdb3e5ad8e40416"}
Mar 18 13:27:16 crc kubenswrapper[4921]: I0318 13:27:16.523369 4921 generic.go:334] "Generic (PLEG): container finished" podID="1c25b5c7-d927-4495-8f80-e2207730825a" containerID="0d246c02212bb0548e6b4b8fa0145bb6ede54beb06728cbffe5869cdfd6b52e8" exitCode=0
Mar 18 13:27:16 crc kubenswrapper[4921]: I0318 13:27:16.523445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p8sxl" event={"ID":"1c25b5c7-d927-4495-8f80-e2207730825a","Type":"ContainerDied","Data":"0d246c02212bb0548e6b4b8fa0145bb6ede54beb06728cbffe5869cdfd6b52e8"}
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.798963 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.910423 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1c25b5c7-d927-4495-8f80-e2207730825a-node-mnt\") pod \"1c25b5c7-d927-4495-8f80-e2207730825a\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") "
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.910508 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnsmj\" (UniqueName: \"kubernetes.io/projected/1c25b5c7-d927-4495-8f80-e2207730825a-kube-api-access-lnsmj\") pod \"1c25b5c7-d927-4495-8f80-e2207730825a\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") "
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.910535 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1c25b5c7-d927-4495-8f80-e2207730825a-crc-storage\") pod \"1c25b5c7-d927-4495-8f80-e2207730825a\" (UID: \"1c25b5c7-d927-4495-8f80-e2207730825a\") "
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.910584 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c25b5c7-d927-4495-8f80-e2207730825a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1c25b5c7-d927-4495-8f80-e2207730825a" (UID: "1c25b5c7-d927-4495-8f80-e2207730825a"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.910754 4921 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1c25b5c7-d927-4495-8f80-e2207730825a-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.917504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c25b5c7-d927-4495-8f80-e2207730825a-kube-api-access-lnsmj" (OuterVolumeSpecName: "kube-api-access-lnsmj") pod "1c25b5c7-d927-4495-8f80-e2207730825a" (UID: "1c25b5c7-d927-4495-8f80-e2207730825a"). InnerVolumeSpecName "kube-api-access-lnsmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:27:17 crc kubenswrapper[4921]: I0318 13:27:17.935919 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c25b5c7-d927-4495-8f80-e2207730825a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1c25b5c7-d927-4495-8f80-e2207730825a" (UID: "1c25b5c7-d927-4495-8f80-e2207730825a"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:27:18 crc kubenswrapper[4921]: I0318 13:27:18.012784 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnsmj\" (UniqueName: \"kubernetes.io/projected/1c25b5c7-d927-4495-8f80-e2207730825a-kube-api-access-lnsmj\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:18 crc kubenswrapper[4921]: I0318 13:27:18.012840 4921 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1c25b5c7-d927-4495-8f80-e2207730825a-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 18 13:27:18 crc kubenswrapper[4921]: I0318 13:27:18.539298 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p8sxl" event={"ID":"1c25b5c7-d927-4495-8f80-e2207730825a","Type":"ContainerDied","Data":"314b3afeea7ba41633ac21e167a61c2501be4a31773aac6d1cdb3e5ad8e40416"}
Mar 18 13:27:18 crc kubenswrapper[4921]: I0318 13:27:18.539618 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="314b3afeea7ba41633ac21e167a61c2501be4a31773aac6d1cdb3e5ad8e40416"
Mar 18 13:27:18 crc kubenswrapper[4921]: I0318 13:27:18.539359 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p8sxl"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.160871 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564008-pchdh"]
Mar 18 13:28:00 crc kubenswrapper[4921]: E0318 13:28:00.162006 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c25b5c7-d927-4495-8f80-e2207730825a" containerName="storage"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.162024 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c25b5c7-d927-4495-8f80-e2207730825a" containerName="storage"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.162237 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c25b5c7-d927-4495-8f80-e2207730825a" containerName="storage"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.162829 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.166784 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.177944 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.178178 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.182754 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-pchdh"]
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.278952 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4cq\" (UniqueName: \"kubernetes.io/projected/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7-kube-api-access-tk4cq\") pod \"auto-csr-approver-29564008-pchdh\" (UID: \"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7\") " pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.380263 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk4cq\" (UniqueName: \"kubernetes.io/projected/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7-kube-api-access-tk4cq\") pod \"auto-csr-approver-29564008-pchdh\" (UID: \"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7\") " pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.402288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk4cq\" (UniqueName: \"kubernetes.io/projected/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7-kube-api-access-tk4cq\") pod \"auto-csr-approver-29564008-pchdh\" (UID: \"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7\") " pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.490559 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:00 crc kubenswrapper[4921]: I0318 13:28:00.925997 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-pchdh"]
Mar 18 13:28:01 crc kubenswrapper[4921]: I0318 13:28:01.842384 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-pchdh" event={"ID":"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7","Type":"ContainerStarted","Data":"896cb5f9dbab506fc4365777b1190346b3a7293587086fe2b69044f62c4dfbfb"}
Mar 18 13:28:02 crc kubenswrapper[4921]: I0318 13:28:02.851714 4921 generic.go:334] "Generic (PLEG): container finished" podID="9152eddf-1c76-4e8f-a9a7-edee46f7c8e7" containerID="8e77ac4016b30e9544bbb440dcfa7597adad767976d604498bf3a6c63197cb84" exitCode=0
Mar 18 13:28:02 crc kubenswrapper[4921]: I0318 13:28:02.851769 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-pchdh" event={"ID":"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7","Type":"ContainerDied","Data":"8e77ac4016b30e9544bbb440dcfa7597adad767976d604498bf3a6c63197cb84"}
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.196417 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.249284 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk4cq\" (UniqueName: \"kubernetes.io/projected/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7-kube-api-access-tk4cq\") pod \"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7\" (UID: \"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7\") "
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.256912 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7-kube-api-access-tk4cq" (OuterVolumeSpecName: "kube-api-access-tk4cq") pod "9152eddf-1c76-4e8f-a9a7-edee46f7c8e7" (UID: "9152eddf-1c76-4e8f-a9a7-edee46f7c8e7"). InnerVolumeSpecName "kube-api-access-tk4cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.350929 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk4cq\" (UniqueName: \"kubernetes.io/projected/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7-kube-api-access-tk4cq\") on node \"crc\" DevicePath \"\""
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.870304 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-pchdh" event={"ID":"9152eddf-1c76-4e8f-a9a7-edee46f7c8e7","Type":"ContainerDied","Data":"896cb5f9dbab506fc4365777b1190346b3a7293587086fe2b69044f62c4dfbfb"}
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.870345 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-pchdh"
Mar 18 13:28:04 crc kubenswrapper[4921]: I0318 13:28:04.870350 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="896cb5f9dbab506fc4365777b1190346b3a7293587086fe2b69044f62c4dfbfb"
Mar 18 13:28:05 crc kubenswrapper[4921]: I0318 13:28:05.277750 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-lcsn8"]
Mar 18 13:28:05 crc kubenswrapper[4921]: I0318 13:28:05.283083 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-lcsn8"]
Mar 18 13:28:07 crc kubenswrapper[4921]: I0318 13:28:07.019617 4921 scope.go:117] "RemoveContainer" containerID="fdbcb2743b54a0e1c0974fb26a0cef13d8a15b40737f6b8c1e9edc268cfbd6d3"
Mar 18 13:28:07 crc kubenswrapper[4921]: I0318 13:28:07.219026 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8892ed1-7ce1-4a66-99bf-37a3a030d7ba" path="/var/lib/kubelet/pods/c8892ed1-7ce1-4a66-99bf-37a3a030d7ba/volumes"
Mar 18 13:28:47 crc kubenswrapper[4921]: I0318 13:28:47.081402 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:28:47 crc kubenswrapper[4921]: I0318 13:28:47.082009 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:29:07 crc kubenswrapper[4921]: I0318 13:29:07.077496 4921 scope.go:117] "RemoveContainer" containerID="9d70895c43def28365a4044f0ad4b0a4eb26203c34ab13189a0d49cc8f8b82c9"
Mar 18 13:29:17 crc kubenswrapper[4921]: I0318 13:29:17.080967 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:29:17 crc kubenswrapper[4921]: I0318 13:29:17.081524 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.081210 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.082047 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.082298 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7"
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.083395 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6801f03ae22e22ea24d6b66923ad809a09a2350a73e361f997823c799fdb1b50"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.083536 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://6801f03ae22e22ea24d6b66923ad809a09a2350a73e361f997823c799fdb1b50" gracePeriod=600
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.709194 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="6801f03ae22e22ea24d6b66923ad809a09a2350a73e361f997823c799fdb1b50" exitCode=0
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.709270 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"6801f03ae22e22ea24d6b66923ad809a09a2350a73e361f997823c799fdb1b50"}
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.710041 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"}
Mar 18 13:29:47 crc kubenswrapper[4921]: I0318 13:29:47.710073 4921 scope.go:117] "RemoveContainer" containerID="a39bf8f887bbee7aece7205919fc35c3bf60d8332bfa36ff1c554a11189e3a5d"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.160425 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564010-857qb"]
Mar 18 13:30:00 crc kubenswrapper[4921]: E0318 13:30:00.161709 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9152eddf-1c76-4e8f-a9a7-edee46f7c8e7" containerName="oc"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.161730 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9152eddf-1c76-4e8f-a9a7-edee46f7c8e7" containerName="oc"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.162000 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9152eddf-1c76-4e8f-a9a7-edee46f7c8e7" containerName="oc"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.162778 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-857qb"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.165697 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.166072 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.166231 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.172763 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"]
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.174030 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.179031 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.179849 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.184711 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-857qb"]
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.200603 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"]
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.248599 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2jb\" (UniqueName: \"kubernetes.io/projected/c2fc77ad-30a6-4212-ac1e-10df854673d4-kube-api-access-jp2jb\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.248653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2fc77ad-30a6-4212-ac1e-10df854673d4-secret-volume\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.248707 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2fc77ad-30a6-4212-ac1e-10df854673d4-config-volume\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.249026 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2gp\" (UniqueName: \"kubernetes.io/projected/726e7c6c-3238-4ca6-b386-be62ced029f8-kube-api-access-gk2gp\") pod \"auto-csr-approver-29564010-857qb\" (UID: \"726e7c6c-3238-4ca6-b386-be62ced029f8\") " pod="openshift-infra/auto-csr-approver-29564010-857qb"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.350740 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2gp\" (UniqueName: \"kubernetes.io/projected/726e7c6c-3238-4ca6-b386-be62ced029f8-kube-api-access-gk2gp\") pod \"auto-csr-approver-29564010-857qb\" (UID: \"726e7c6c-3238-4ca6-b386-be62ced029f8\") " pod="openshift-infra/auto-csr-approver-29564010-857qb"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.350821 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2jb\" (UniqueName: \"kubernetes.io/projected/c2fc77ad-30a6-4212-ac1e-10df854673d4-kube-api-access-jp2jb\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.350848 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2fc77ad-30a6-4212-ac1e-10df854673d4-secret-volume\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.350874 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2fc77ad-30a6-4212-ac1e-10df854673d4-config-volume\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.351801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2fc77ad-30a6-4212-ac1e-10df854673d4-config-volume\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.357348 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2fc77ad-30a6-4212-ac1e-10df854673d4-secret-volume\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.368089 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2gp\" (UniqueName: \"kubernetes.io/projected/726e7c6c-3238-4ca6-b386-be62ced029f8-kube-api-access-gk2gp\") pod \"auto-csr-approver-29564010-857qb\" (UID: \"726e7c6c-3238-4ca6-b386-be62ced029f8\") " pod="openshift-infra/auto-csr-approver-29564010-857qb"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.368269 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2jb\" (UniqueName: \"kubernetes.io/projected/c2fc77ad-30a6-4212-ac1e-10df854673d4-kube-api-access-jp2jb\") pod \"collect-profiles-29564010-qxjb6\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.486456 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-857qb"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.500433 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.922267 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"]
Mar 18 13:30:00 crc kubenswrapper[4921]: W0318 13:30:00.924044 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2fc77ad_30a6_4212_ac1e_10df854673d4.slice/crio-df2e6a2e063f055c321e5015b5123cedf242e39732de6649436fd6101052a0b6 WatchSource:0}: Error finding container df2e6a2e063f055c321e5015b5123cedf242e39732de6649436fd6101052a0b6: Status 404 returned error can't find the container with id df2e6a2e063f055c321e5015b5123cedf242e39732de6649436fd6101052a0b6
Mar 18 13:30:00 crc kubenswrapper[4921]: I0318 13:30:00.972521 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-857qb"]
Mar 18 13:30:00 crc kubenswrapper[4921]: W0318 13:30:00.975031 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod726e7c6c_3238_4ca6_b386_be62ced029f8.slice/crio-8c07cbfbc2a452348e2b4cb493e3e27a1d62ede91bc5b0c3ed10c892e2222a42 WatchSource:0}: Error finding container 8c07cbfbc2a452348e2b4cb493e3e27a1d62ede91bc5b0c3ed10c892e2222a42: Status 404 returned error can't find the container with id 8c07cbfbc2a452348e2b4cb493e3e27a1d62ede91bc5b0c3ed10c892e2222a42
Mar 18 13:30:01 crc kubenswrapper[4921]: I0318 13:30:01.835907 4921
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-857qb" event={"ID":"726e7c6c-3238-4ca6-b386-be62ced029f8","Type":"ContainerStarted","Data":"8c07cbfbc2a452348e2b4cb493e3e27a1d62ede91bc5b0c3ed10c892e2222a42"} Mar 18 13:30:01 crc kubenswrapper[4921]: I0318 13:30:01.837880 4921 generic.go:334] "Generic (PLEG): container finished" podID="c2fc77ad-30a6-4212-ac1e-10df854673d4" containerID="442a1e494b4f5a6b449196769045786fbeeb468ad31f06294237bd669c6bb640" exitCode=0 Mar 18 13:30:01 crc kubenswrapper[4921]: I0318 13:30:01.837912 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6" event={"ID":"c2fc77ad-30a6-4212-ac1e-10df854673d4","Type":"ContainerDied","Data":"442a1e494b4f5a6b449196769045786fbeeb468ad31f06294237bd669c6bb640"} Mar 18 13:30:01 crc kubenswrapper[4921]: I0318 13:30:01.837928 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6" event={"ID":"c2fc77ad-30a6-4212-ac1e-10df854673d4","Type":"ContainerStarted","Data":"df2e6a2e063f055c321e5015b5123cedf242e39732de6649436fd6101052a0b6"} Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.112858 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6" Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.186798 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2jb\" (UniqueName: \"kubernetes.io/projected/c2fc77ad-30a6-4212-ac1e-10df854673d4-kube-api-access-jp2jb\") pod \"c2fc77ad-30a6-4212-ac1e-10df854673d4\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.186853 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2fc77ad-30a6-4212-ac1e-10df854673d4-config-volume\") pod \"c2fc77ad-30a6-4212-ac1e-10df854673d4\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.186888 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2fc77ad-30a6-4212-ac1e-10df854673d4-secret-volume\") pod \"c2fc77ad-30a6-4212-ac1e-10df854673d4\" (UID: \"c2fc77ad-30a6-4212-ac1e-10df854673d4\") " Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.187971 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fc77ad-30a6-4212-ac1e-10df854673d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2fc77ad-30a6-4212-ac1e-10df854673d4" (UID: "c2fc77ad-30a6-4212-ac1e-10df854673d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.192270 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fc77ad-30a6-4212-ac1e-10df854673d4-kube-api-access-jp2jb" (OuterVolumeSpecName: "kube-api-access-jp2jb") pod "c2fc77ad-30a6-4212-ac1e-10df854673d4" (UID: "c2fc77ad-30a6-4212-ac1e-10df854673d4"). 
InnerVolumeSpecName "kube-api-access-jp2jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.192910 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fc77ad-30a6-4212-ac1e-10df854673d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2fc77ad-30a6-4212-ac1e-10df854673d4" (UID: "c2fc77ad-30a6-4212-ac1e-10df854673d4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.288289 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp2jb\" (UniqueName: \"kubernetes.io/projected/c2fc77ad-30a6-4212-ac1e-10df854673d4-kube-api-access-jp2jb\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.288322 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2fc77ad-30a6-4212-ac1e-10df854673d4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4921]: I0318 13:30:03.288335 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2fc77ad-30a6-4212-ac1e-10df854673d4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:04 crc kubenswrapper[4921]: I0318 13:30:04.197514 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5"] Mar 18 13:30:04 crc kubenswrapper[4921]: I0318 13:30:04.203719 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-2ntx5"] Mar 18 13:30:04 crc kubenswrapper[4921]: I0318 13:30:04.805716 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6" 
event={"ID":"c2fc77ad-30a6-4212-ac1e-10df854673d4","Type":"ContainerDied","Data":"df2e6a2e063f055c321e5015b5123cedf242e39732de6649436fd6101052a0b6"} Mar 18 13:30:04 crc kubenswrapper[4921]: I0318 13:30:04.806040 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df2e6a2e063f055c321e5015b5123cedf242e39732de6649436fd6101052a0b6" Mar 18 13:30:04 crc kubenswrapper[4921]: I0318 13:30:04.805787 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6" Mar 18 13:30:05 crc kubenswrapper[4921]: I0318 13:30:05.218285 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148f26a3-3a38-4c59-b627-ed51585387fe" path="/var/lib/kubelet/pods/148f26a3-3a38-4c59-b627-ed51585387fe/volumes" Mar 18 13:30:05 crc kubenswrapper[4921]: I0318 13:30:05.814302 4921 generic.go:334] "Generic (PLEG): container finished" podID="726e7c6c-3238-4ca6-b386-be62ced029f8" containerID="95a31e4dc8fc7181298fa4d06a6762fb24ad4d13ae8e95ad1b8ca74e4fd0dde4" exitCode=0 Mar 18 13:30:05 crc kubenswrapper[4921]: I0318 13:30:05.814368 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-857qb" event={"ID":"726e7c6c-3238-4ca6-b386-be62ced029f8","Type":"ContainerDied","Data":"95a31e4dc8fc7181298fa4d06a6762fb24ad4d13ae8e95ad1b8ca74e4fd0dde4"} Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.116530 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-857qb" Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.133014 4921 scope.go:117] "RemoveContainer" containerID="bf94485548949f36d9248c8f52d899bdc9cb04baa9a98f85413de2c3d0446a8f" Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.192313 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk2gp\" (UniqueName: \"kubernetes.io/projected/726e7c6c-3238-4ca6-b386-be62ced029f8-kube-api-access-gk2gp\") pod \"726e7c6c-3238-4ca6-b386-be62ced029f8\" (UID: \"726e7c6c-3238-4ca6-b386-be62ced029f8\") " Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.198581 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726e7c6c-3238-4ca6-b386-be62ced029f8-kube-api-access-gk2gp" (OuterVolumeSpecName: "kube-api-access-gk2gp") pod "726e7c6c-3238-4ca6-b386-be62ced029f8" (UID: "726e7c6c-3238-4ca6-b386-be62ced029f8"). InnerVolumeSpecName "kube-api-access-gk2gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.295897 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk2gp\" (UniqueName: \"kubernetes.io/projected/726e7c6c-3238-4ca6-b386-be62ced029f8-kube-api-access-gk2gp\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.846005 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-857qb" event={"ID":"726e7c6c-3238-4ca6-b386-be62ced029f8","Type":"ContainerDied","Data":"8c07cbfbc2a452348e2b4cb493e3e27a1d62ede91bc5b0c3ed10c892e2222a42"} Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.846071 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c07cbfbc2a452348e2b4cb493e3e27a1d62ede91bc5b0c3ed10c892e2222a42" Mar 18 13:30:07 crc kubenswrapper[4921]: I0318 13:30:07.846177 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-857qb" Mar 18 13:30:08 crc kubenswrapper[4921]: I0318 13:30:08.182707 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-st5g2"] Mar 18 13:30:08 crc kubenswrapper[4921]: I0318 13:30:08.189557 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-st5g2"] Mar 18 13:30:09 crc kubenswrapper[4921]: I0318 13:30:09.217744 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff7a1cd-28fa-4375-959e-3306256aa950" path="/var/lib/kubelet/pods/3ff7a1cd-28fa-4375-959e-3306256aa950/volumes" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.188458 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xlgb8"] Mar 18 13:30:32 crc kubenswrapper[4921]: E0318 13:30:32.189358 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2fc77ad-30a6-4212-ac1e-10df854673d4" containerName="collect-profiles" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.189372 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fc77ad-30a6-4212-ac1e-10df854673d4" containerName="collect-profiles" Mar 18 13:30:32 crc kubenswrapper[4921]: E0318 13:30:32.189391 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726e7c6c-3238-4ca6-b386-be62ced029f8" containerName="oc" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.189398 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="726e7c6c-3238-4ca6-b386-be62ced029f8" containerName="oc" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.189540 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="726e7c6c-3238-4ca6-b386-be62ced029f8" containerName="oc" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.189565 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fc77ad-30a6-4212-ac1e-10df854673d4" containerName="collect-profiles" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.190270 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.194828 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.195858 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.195996 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.196139 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-gtv8k" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.196352 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.213778 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xlgb8"] Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.371524 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj54h\" (UniqueName: \"kubernetes.io/projected/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-kube-api-access-kj54h\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.371609 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-config\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.372234 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.476728 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj54h\" (UniqueName: \"kubernetes.io/projected/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-kube-api-access-kj54h\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.476806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-config\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.476849 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.478073 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-config\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.478140 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.489507 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-6jk6h"] Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.491373 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.499711 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-6jk6h"] Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.508001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj54h\" (UniqueName: \"kubernetes.io/projected/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-kube-api-access-kj54h\") pod \"dnsmasq-dns-5d7b5456f5-xlgb8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.517095 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.578264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5pr\" (UniqueName: \"kubernetes.io/projected/c90674ee-98de-43c5-9346-e4a0d002bcfc-kube-api-access-rk5pr\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.578330 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.578405 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-config\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.685745 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-config\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.686286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5pr\" (UniqueName: \"kubernetes.io/projected/c90674ee-98de-43c5-9346-e4a0d002bcfc-kube-api-access-rk5pr\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " 
pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.686409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.687013 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-config\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.687583 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.708240 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5pr\" (UniqueName: \"kubernetes.io/projected/c90674ee-98de-43c5-9346-e4a0d002bcfc-kube-api-access-rk5pr\") pod \"dnsmasq-dns-98ddfc8f-6jk6h\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") " pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:32 crc kubenswrapper[4921]: I0318 13:30:32.809807 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.040619 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xlgb8"] Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.285310 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-6jk6h"] Mar 18 13:30:33 crc kubenswrapper[4921]: W0318 13:30:33.302086 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90674ee_98de_43c5_9346_e4a0d002bcfc.slice/crio-e38b2540adfe31883fdcf4f86bf87fdc2e655c5f87653a22cbc7be168ffc1b01 WatchSource:0}: Error finding container e38b2540adfe31883fdcf4f86bf87fdc2e655c5f87653a22cbc7be168ffc1b01: Status 404 returned error can't find the container with id e38b2540adfe31883fdcf4f86bf87fdc2e655c5f87653a22cbc7be168ffc1b01 Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.320467 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.322192 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.334402 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tmcsw" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.334720 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.334897 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.335070 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.335270 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.338177 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.498683 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.498767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d66395fa-b434-4b08-ae7f-d061fd4fd559-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.498806 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.498839 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d66395fa-b434-4b08-ae7f-d061fd4fd559-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.498879 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.499000 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.499035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mg9\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-kube-api-access-76mg9\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.499192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.499261 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.601864 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d66395fa-b434-4b08-ae7f-d061fd4fd559-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602349 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602396 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d66395fa-b434-4b08-ae7f-d061fd4fd559-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602436 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602468 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602497 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76mg9\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-kube-api-access-76mg9\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602546 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602579 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.602633 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 
13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.603568 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.603749 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.603778 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.603944 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.607193 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d66395fa-b434-4b08-ae7f-d061fd4fd559-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.607732 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d66395fa-b434-4b08-ae7f-d061fd4fd559-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.614988 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.615041 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ece1b5101c10ac68a0b85cec9466746f312c5b8b54289739ca02c967f81234e/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.618220 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.628646 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76mg9\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-kube-api-access-76mg9\") pod \"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.648193 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod 
\"rabbitmq-server-0\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.661067 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.662427 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.664454 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.664617 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.666201 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ldptp" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.666259 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.679456 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.682041 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810251 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810309 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810400 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810495 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810578 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfkjq\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-kube-api-access-kfkjq\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810653 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.810734 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912096 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 
13:30:33.912252 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912285 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfkjq\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-kube-api-access-kfkjq\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912343 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912425 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.912448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.913767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.914257 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.915070 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.915643 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.916158 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.916194 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/86d5c7f1f45d520f6f366c0ada6bb6fa4bdfe9bcfe5bf864278546983f3060d6/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.919329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.923046 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.939996 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfkjq\" (UniqueName: 
\"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-kube-api-access-kfkjq\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.941979 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.946896 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:30:33 crc kubenswrapper[4921]: I0318 13:30:33.948514 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.029552 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.037808 4921 generic.go:334] "Generic (PLEG): container finished" podID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerID="eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850" exitCode=0 Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.037925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" event={"ID":"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8","Type":"ContainerDied","Data":"eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850"} Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.037960 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" event={"ID":"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8","Type":"ContainerStarted","Data":"72d9184c4d0db18d25650a7be1d0d1ea808a919c1e9387bb1b1bfaf0e0adac1e"} Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.045572 4921 generic.go:334] "Generic (PLEG): container finished" podID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerID="494bc219cf6a6f975ab96a4222a5629103e78ce646d1bc6f7e2da8e910b77cf8" exitCode=0 Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.046080 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" event={"ID":"c90674ee-98de-43c5-9346-e4a0d002bcfc","Type":"ContainerDied","Data":"494bc219cf6a6f975ab96a4222a5629103e78ce646d1bc6f7e2da8e910b77cf8"} Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.046137 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" event={"ID":"c90674ee-98de-43c5-9346-e4a0d002bcfc","Type":"ContainerStarted","Data":"e38b2540adfe31883fdcf4f86bf87fdc2e655c5f87653a22cbc7be168ffc1b01"} Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.468896 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:30:34 crc 
kubenswrapper[4921]: W0318 13:30:34.470960 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd66395fa_b434_4b08_ae7f_d061fd4fd559.slice/crio-d52f8688d964922a484d83e265947bd859c388f3436cc677fff11cc06259df07 WatchSource:0}: Error finding container d52f8688d964922a484d83e265947bd859c388f3436cc677fff11cc06259df07: Status 404 returned error can't find the container with id d52f8688d964922a484d83e265947bd859c388f3436cc677fff11cc06259df07 Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.579532 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:30:34 crc kubenswrapper[4921]: W0318 13:30:34.587619 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32df9bf_c9fd_447c_ac1c_b13028f2e12d.slice/crio-41eaa1cca6163a8361465a9b4a12478ff3c517170a43426132e59d7841a44d00 WatchSource:0}: Error finding container 41eaa1cca6163a8361465a9b4a12478ff3c517170a43426132e59d7841a44d00: Status 404 returned error can't find the container with id 41eaa1cca6163a8361465a9b4a12478ff3c517170a43426132e59d7841a44d00 Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.836718 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.837869 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.843646 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.844016 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.844448 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.844714 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zz99z" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.853919 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.875852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.938921 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8fb\" (UniqueName: \"kubernetes.io/projected/974cac56-e861-4c87-98aa-b5d0d098fa15-kube-api-access-dd8fb\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.938998 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-operator-scripts\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.939101 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-kolla-config\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.939154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-config-data-default\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.939207 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974cac56-e861-4c87-98aa-b5d0d098fa15-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.939430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/974cac56-e861-4c87-98aa-b5d0d098fa15-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.939542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/974cac56-e861-4c87-98aa-b5d0d098fa15-config-data-generated\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:34 crc kubenswrapper[4921]: I0318 13:30:34.939734 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041725 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/974cac56-e861-4c87-98aa-b5d0d098fa15-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041781 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/974cac56-e861-4c87-98aa-b5d0d098fa15-config-data-generated\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041894 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8fb\" (UniqueName: \"kubernetes.io/projected/974cac56-e861-4c87-98aa-b5d0d098fa15-kube-api-access-dd8fb\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041924 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-operator-scripts\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041963 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-kolla-config\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.041990 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-config-data-default\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.042034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974cac56-e861-4c87-98aa-b5d0d098fa15-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.042329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/974cac56-e861-4c87-98aa-b5d0d098fa15-config-data-generated\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.043521 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.043821 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-operator-scripts\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.043999 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/974cac56-e861-4c87-98aa-b5d0d098fa15-kolla-config\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.048753 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/974cac56-e861-4c87-98aa-b5d0d098fa15-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.048826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974cac56-e861-4c87-98aa-b5d0d098fa15-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.052768 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.052818 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f03df1162994f4228821191c899c30e233aa3eae1039d580ba37f0746650e0d1/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.057534 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e32df9bf-c9fd-447c-ac1c-b13028f2e12d","Type":"ContainerStarted","Data":"41eaa1cca6163a8361465a9b4a12478ff3c517170a43426132e59d7841a44d00"} Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.059961 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d66395fa-b434-4b08-ae7f-d061fd4fd559","Type":"ContainerStarted","Data":"d52f8688d964922a484d83e265947bd859c388f3436cc677fff11cc06259df07"} Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.069303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" event={"ID":"c90674ee-98de-43c5-9346-e4a0d002bcfc","Type":"ContainerStarted","Data":"0ea3e4113e3de356fc4f546636e220a227e9c083ab51cf12f10226cf602b69cc"} Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.069818 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8fb\" (UniqueName: \"kubernetes.io/projected/974cac56-e861-4c87-98aa-b5d0d098fa15-kube-api-access-dd8fb\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.070154 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.074716 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" event={"ID":"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8","Type":"ContainerStarted","Data":"6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343"} Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.074946 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.102674 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" podStartSLOduration=3.102652278 podStartE2EDuration="3.102652278s" podCreationTimestamp="2026-03-18 13:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:35.09854194 +0000 UTC m=+4854.648462699" watchObservedRunningTime="2026-03-18 13:30:35.102652278 +0000 UTC m=+4854.652572917" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.109196 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48196f5a-804b-46c3-a482-ff01f3b97cf8\") pod \"openstack-galera-0\" (UID: \"974cac56-e861-4c87-98aa-b5d0d098fa15\") " pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.134009 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" podStartSLOduration=3.133984764 podStartE2EDuration="3.133984764s" podCreationTimestamp="2026-03-18 13:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:35.127725995 +0000 UTC 
m=+4854.677646634" watchObservedRunningTime="2026-03-18 13:30:35.133984764 +0000 UTC m=+4854.683905403" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.167165 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.169217 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.172157 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.188820 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.188995 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-77pbg" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.198778 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.357102 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/097f7983-d88f-4756-af26-aa3ca22d865a-kolla-config\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.357279 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/097f7983-d88f-4756-af26-aa3ca22d865a-config-data\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.357351 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z7ns7\" (UniqueName: \"kubernetes.io/projected/097f7983-d88f-4756-af26-aa3ca22d865a-kube-api-access-z7ns7\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.458289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/097f7983-d88f-4756-af26-aa3ca22d865a-kolla-config\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.458414 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/097f7983-d88f-4756-af26-aa3ca22d865a-config-data\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.458457 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ns7\" (UniqueName: \"kubernetes.io/projected/097f7983-d88f-4756-af26-aa3ca22d865a-kube-api-access-z7ns7\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.459767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/097f7983-d88f-4756-af26-aa3ca22d865a-kolla-config\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.459798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/097f7983-d88f-4756-af26-aa3ca22d865a-config-data\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc 
kubenswrapper[4921]: I0318 13:30:35.566126 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ns7\" (UniqueName: \"kubernetes.io/projected/097f7983-d88f-4756-af26-aa3ca22d865a-kube-api-access-z7ns7\") pod \"memcached-0\" (UID: \"097f7983-d88f-4756-af26-aa3ca22d865a\") " pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.678454 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 13:30:35 crc kubenswrapper[4921]: W0318 13:30:35.878105 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974cac56_e861_4c87_98aa_b5d0d098fa15.slice/crio-ce0e637029e255724f960d07ef8c71ef425b27e9f1769e30de3374b25e71be00 WatchSource:0}: Error finding container ce0e637029e255724f960d07ef8c71ef425b27e9f1769e30de3374b25e71be00: Status 404 returned error can't find the container with id ce0e637029e255724f960d07ef8c71ef425b27e9f1769e30de3374b25e71be00 Mar 18 13:30:35 crc kubenswrapper[4921]: I0318 13:30:35.872745 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.084288 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e32df9bf-c9fd-447c-ac1c-b13028f2e12d","Type":"ContainerStarted","Data":"244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70"} Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.087488 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"974cac56-e861-4c87-98aa-b5d0d098fa15","Type":"ContainerStarted","Data":"c5692b0cf7be049427d90cf97ad690288b4f41d9782f2f26ec52d5310592b111"} Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.087595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"974cac56-e861-4c87-98aa-b5d0d098fa15","Type":"ContainerStarted","Data":"ce0e637029e255724f960d07ef8c71ef425b27e9f1769e30de3374b25e71be00"} Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.093194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d66395fa-b434-4b08-ae7f-d061fd4fd559","Type":"ContainerStarted","Data":"e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa"} Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.159160 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.435028 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.437920 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.442025 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-d6jsx" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.442657 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.443538 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.443775 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.444647 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.597371 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.598729 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d10202-0f8e-4004-aede-0b5ed2c63589-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.598842 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d10202-0f8e-4004-aede-0b5ed2c63589-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.599029 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.599203 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.599241 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78d10202-0f8e-4004-aede-0b5ed2c63589-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.599300 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mwc\" (UniqueName: \"kubernetes.io/projected/78d10202-0f8e-4004-aede-0b5ed2c63589-kube-api-access-24mwc\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.599458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d10202-0f8e-4004-aede-0b5ed2c63589-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700639 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d10202-0f8e-4004-aede-0b5ed2c63589-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700679 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700723 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78d10202-0f8e-4004-aede-0b5ed2c63589-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700768 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24mwc\" (UniqueName: \"kubernetes.io/projected/78d10202-0f8e-4004-aede-0b5ed2c63589-kube-api-access-24mwc\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.700811 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.701462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.701591 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78d10202-0f8e-4004-aede-0b5ed2c63589-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.702594 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.702749 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78d10202-0f8e-4004-aede-0b5ed2c63589-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.707441 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/78d10202-0f8e-4004-aede-0b5ed2c63589-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.707552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d10202-0f8e-4004-aede-0b5ed2c63589-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.717326 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.717403 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0fa8df7509389df228ee43ce2ccbe79732e3d0f9d95067e76456068dc4f0c915/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.719185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mwc\" (UniqueName: \"kubernetes.io/projected/78d10202-0f8e-4004-aede-0b5ed2c63589-kube-api-access-24mwc\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.754836 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf8e58ed-8a78-4752-9cb3-a93fbe83498a\") pod \"openstack-cell1-galera-0\" (UID: \"78d10202-0f8e-4004-aede-0b5ed2c63589\") " pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:36 crc kubenswrapper[4921]: I0318 13:30:36.848097 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:37 crc kubenswrapper[4921]: I0318 13:30:37.098526 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"097f7983-d88f-4756-af26-aa3ca22d865a","Type":"ContainerStarted","Data":"063d0c4538f04da4e73a43d2d519b446ec29fc655eb039344eeb84f00da4dd75"} Mar 18 13:30:37 crc kubenswrapper[4921]: I0318 13:30:37.098998 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"097f7983-d88f-4756-af26-aa3ca22d865a","Type":"ContainerStarted","Data":"68859af920ed26d6b4a9fd4f78d58b62b662f3af1cc898a20775cc9cf9154423"} Mar 18 13:30:37 crc kubenswrapper[4921]: I0318 13:30:37.115499 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.115478407 podStartE2EDuration="2.115478407s" podCreationTimestamp="2026-03-18 13:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:37.114957832 +0000 UTC m=+4856.664878471" watchObservedRunningTime="2026-03-18 13:30:37.115478407 +0000 UTC m=+4856.665399046" Mar 18 13:30:37 crc kubenswrapper[4921]: I0318 13:30:37.288759 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 13:30:38 crc kubenswrapper[4921]: I0318 13:30:38.113255 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"78d10202-0f8e-4004-aede-0b5ed2c63589","Type":"ContainerStarted","Data":"ad2cd720fc7b42f3bfed7e7bd63d325143e85c3d8ce9e14436f7f398e1a00f7a"} Mar 18 13:30:38 crc kubenswrapper[4921]: I0318 13:30:38.113693 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78d10202-0f8e-4004-aede-0b5ed2c63589","Type":"ContainerStarted","Data":"cde55d6bcbe0626862fd43131979640af6714f75f74c71126c9733c80e2d8951"} Mar 18 13:30:38 crc kubenswrapper[4921]: I0318 13:30:38.113716 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 13:30:41 crc kubenswrapper[4921]: I0318 13:30:41.137360 4921 generic.go:334] "Generic (PLEG): container finished" podID="974cac56-e861-4c87-98aa-b5d0d098fa15" containerID="c5692b0cf7be049427d90cf97ad690288b4f41d9782f2f26ec52d5310592b111" exitCode=0 Mar 18 13:30:41 crc kubenswrapper[4921]: I0318 13:30:41.137576 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"974cac56-e861-4c87-98aa-b5d0d098fa15","Type":"ContainerDied","Data":"c5692b0cf7be049427d90cf97ad690288b4f41d9782f2f26ec52d5310592b111"} Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.149591 4921 generic.go:334] "Generic (PLEG): container finished" podID="78d10202-0f8e-4004-aede-0b5ed2c63589" containerID="ad2cd720fc7b42f3bfed7e7bd63d325143e85c3d8ce9e14436f7f398e1a00f7a" exitCode=0 Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.149724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78d10202-0f8e-4004-aede-0b5ed2c63589","Type":"ContainerDied","Data":"ad2cd720fc7b42f3bfed7e7bd63d325143e85c3d8ce9e14436f7f398e1a00f7a"} Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.152471 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"974cac56-e861-4c87-98aa-b5d0d098fa15","Type":"ContainerStarted","Data":"83d2682d84b8c70c135e7091c6457c2f7e72e367c1cbd0a8a7133468afb18c74"} Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.229401 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.229368788 podStartE2EDuration="9.229368788s" podCreationTimestamp="2026-03-18 13:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:42.222073709 +0000 UTC m=+4861.771994358" watchObservedRunningTime="2026-03-18 13:30:42.229368788 +0000 UTC m=+4861.779289427" Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.519299 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.812460 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" Mar 18 13:30:42 crc kubenswrapper[4921]: I0318 13:30:42.903946 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xlgb8"] Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.161670 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerName="dnsmasq-dns" containerID="cri-o://6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343" gracePeriod=10 Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.161699 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78d10202-0f8e-4004-aede-0b5ed2c63589","Type":"ContainerStarted","Data":"b443b451d80cf68a695758e261eae458f79fc41f28e37fc140b1f60851c4e7d3"} Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.189849 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.189829223 podStartE2EDuration="8.189829223s" podCreationTimestamp="2026-03-18 13:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:30:43.181702941 +0000 UTC m=+4862.731623580" watchObservedRunningTime="2026-03-18 13:30:43.189829223 +0000 UTC m=+4862.739749862" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.583000 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.731490 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-config\") pod \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.731632 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-dns-svc\") pod \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.731680 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj54h\" (UniqueName: \"kubernetes.io/projected/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-kube-api-access-kj54h\") pod \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\" (UID: \"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8\") " Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.743819 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-kube-api-access-kj54h" (OuterVolumeSpecName: 
"kube-api-access-kj54h") pod "6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" (UID: "6785ef5a-a480-4c2d-a4a1-16a6de00b5c8"). InnerVolumeSpecName "kube-api-access-kj54h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.776889 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" (UID: "6785ef5a-a480-4c2d-a4a1-16a6de00b5c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.784593 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-config" (OuterVolumeSpecName: "config") pod "6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" (UID: "6785ef5a-a480-4c2d-a4a1-16a6de00b5c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.833465 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.833498 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:43 crc kubenswrapper[4921]: I0318 13:30:43.833512 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj54h\" (UniqueName: \"kubernetes.io/projected/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8-kube-api-access-kj54h\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.176373 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerID="6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343" exitCode=0 Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.176435 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" event={"ID":"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8","Type":"ContainerDied","Data":"6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343"} Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.176470 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" event={"ID":"6785ef5a-a480-4c2d-a4a1-16a6de00b5c8","Type":"ContainerDied","Data":"72d9184c4d0db18d25650a7be1d0d1ea808a919c1e9387bb1b1bfaf0e0adac1e"} Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.176492 4921 scope.go:117] "RemoveContainer" containerID="6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.176644 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xlgb8" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.196022 4921 scope.go:117] "RemoveContainer" containerID="eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.206146 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xlgb8"] Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.211551 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xlgb8"] Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.227560 4921 scope.go:117] "RemoveContainer" containerID="6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343" Mar 18 13:30:44 crc kubenswrapper[4921]: E0318 13:30:44.227975 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343\": container with ID starting with 6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343 not found: ID does not exist" containerID="6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.228004 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343"} err="failed to get container status \"6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343\": rpc error: code = NotFound desc = could not find container \"6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343\": container with ID starting with 6d392a86b14e2697e4fad02b280597acd3c068873cae0148d5227856ab3a1343 not found: ID does not exist" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.228023 4921 scope.go:117] "RemoveContainer" containerID="eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850" Mar 18 
13:30:44 crc kubenswrapper[4921]: E0318 13:30:44.228269 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850\": container with ID starting with eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850 not found: ID does not exist" containerID="eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850" Mar 18 13:30:44 crc kubenswrapper[4921]: I0318 13:30:44.228295 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850"} err="failed to get container status \"eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850\": rpc error: code = NotFound desc = could not find container \"eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850\": container with ID starting with eb9b72cf8da9037f00b9175068a75e377d19609000aeef133b82d924d8763850 not found: ID does not exist" Mar 18 13:30:45 crc kubenswrapper[4921]: I0318 13:30:45.173073 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 13:30:45 crc kubenswrapper[4921]: I0318 13:30:45.173146 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 13:30:45 crc kubenswrapper[4921]: I0318 13:30:45.222761 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" path="/var/lib/kubelet/pods/6785ef5a-a480-4c2d-a4a1-16a6de00b5c8/volumes" Mar 18 13:30:45 crc kubenswrapper[4921]: I0318 13:30:45.680326 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 13:30:46 crc kubenswrapper[4921]: I0318 13:30:46.848556 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:46 crc 
kubenswrapper[4921]: I0318 13:30:46.848978 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:46 crc kubenswrapper[4921]: I0318 13:30:46.924748 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:47 crc kubenswrapper[4921]: I0318 13:30:47.267751 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 13:30:47 crc kubenswrapper[4921]: I0318 13:30:47.543288 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 13:30:47 crc kubenswrapper[4921]: I0318 13:30:47.629480 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.804312 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8wcgj"] Mar 18 13:30:53 crc kubenswrapper[4921]: E0318 13:30:53.805087 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerName="init" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.805099 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerName="init" Mar 18 13:30:53 crc kubenswrapper[4921]: E0318 13:30:53.805148 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerName="dnsmasq-dns" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.805154 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerName="dnsmasq-dns" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.805301 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6785ef5a-a480-4c2d-a4a1-16a6de00b5c8" containerName="dnsmasq-dns" Mar 18 
13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.805797 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.810375 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.817565 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8wcgj"] Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.912264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqf52\" (UniqueName: \"kubernetes.io/projected/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-kube-api-access-cqf52\") pod \"root-account-create-update-8wcgj\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:53 crc kubenswrapper[4921]: I0318 13:30:53.912979 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-operator-scripts\") pod \"root-account-create-update-8wcgj\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:54 crc kubenswrapper[4921]: I0318 13:30:54.014523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqf52\" (UniqueName: \"kubernetes.io/projected/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-kube-api-access-cqf52\") pod \"root-account-create-update-8wcgj\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:54 crc kubenswrapper[4921]: I0318 13:30:54.014605 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-operator-scripts\") pod \"root-account-create-update-8wcgj\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:54 crc kubenswrapper[4921]: I0318 13:30:54.015284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-operator-scripts\") pod \"root-account-create-update-8wcgj\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:54 crc kubenswrapper[4921]: I0318 13:30:54.039337 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqf52\" (UniqueName: \"kubernetes.io/projected/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-kube-api-access-cqf52\") pod \"root-account-create-update-8wcgj\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:54 crc kubenswrapper[4921]: I0318 13:30:54.124725 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:54 crc kubenswrapper[4921]: I0318 13:30:54.571750 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8wcgj"] Mar 18 13:30:55 crc kubenswrapper[4921]: I0318 13:30:55.272058 4921 generic.go:334] "Generic (PLEG): container finished" podID="c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" containerID="04524717c112007438a97dc276099a2b35127aa2a98b868a2d2228311fdfc91a" exitCode=0 Mar 18 13:30:55 crc kubenswrapper[4921]: I0318 13:30:55.272104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wcgj" event={"ID":"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73","Type":"ContainerDied","Data":"04524717c112007438a97dc276099a2b35127aa2a98b868a2d2228311fdfc91a"} Mar 18 13:30:55 crc kubenswrapper[4921]: I0318 13:30:55.272153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wcgj" event={"ID":"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73","Type":"ContainerStarted","Data":"b2f2d73e7e11f901283b48544f27fe2c70f33bdfa5d8701cee9a41df1d59ff46"} Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.699808 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wcgj" Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.721211 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqf52\" (UniqueName: \"kubernetes.io/projected/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-kube-api-access-cqf52\") pod \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.721323 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-operator-scripts\") pod \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\" (UID: \"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73\") " Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.722527 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" (UID: "c10ec6da-60e0-4d68-a4af-a8c4f35cbc73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.733390 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-kube-api-access-cqf52" (OuterVolumeSpecName: "kube-api-access-cqf52") pod "c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" (UID: "c10ec6da-60e0-4d68-a4af-a8c4f35cbc73"). InnerVolumeSpecName "kube-api-access-cqf52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.822385 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqf52\" (UniqueName: \"kubernetes.io/projected/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-kube-api-access-cqf52\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:56 crc kubenswrapper[4921]: I0318 13:30:56.822427 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:57 crc kubenswrapper[4921]: I0318 13:30:57.287935 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wcgj" event={"ID":"c10ec6da-60e0-4d68-a4af-a8c4f35cbc73","Type":"ContainerDied","Data":"b2f2d73e7e11f901283b48544f27fe2c70f33bdfa5d8701cee9a41df1d59ff46"} Mar 18 13:30:57 crc kubenswrapper[4921]: I0318 13:30:57.288417 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f2d73e7e11f901283b48544f27fe2c70f33bdfa5d8701cee9a41df1d59ff46" Mar 18 13:30:57 crc kubenswrapper[4921]: I0318 13:30:57.288066 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wcgj" Mar 18 13:31:00 crc kubenswrapper[4921]: I0318 13:31:00.326335 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8wcgj"] Mar 18 13:31:00 crc kubenswrapper[4921]: I0318 13:31:00.335740 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8wcgj"] Mar 18 13:31:01 crc kubenswrapper[4921]: I0318 13:31:01.218478 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" path="/var/lib/kubelet/pods/c10ec6da-60e0-4d68-a4af-a8c4f35cbc73/volumes" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.326867 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8m58f"] Mar 18 13:31:05 crc kubenswrapper[4921]: E0318 13:31:05.327602 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" containerName="mariadb-account-create-update" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.327615 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" containerName="mariadb-account-create-update" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.327781 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c10ec6da-60e0-4d68-a4af-a8c4f35cbc73" containerName="mariadb-account-create-update" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.328284 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.330674 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.345999 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8m58f"] Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.432769 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41abfa9f-2f52-410c-858f-97cc71b3a8c0-operator-scripts\") pod \"root-account-create-update-8m58f\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.433432 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfms2\" (UniqueName: \"kubernetes.io/projected/41abfa9f-2f52-410c-858f-97cc71b3a8c0-kube-api-access-pfms2\") pod \"root-account-create-update-8m58f\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.535325 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41abfa9f-2f52-410c-858f-97cc71b3a8c0-operator-scripts\") pod \"root-account-create-update-8m58f\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.535387 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfms2\" (UniqueName: \"kubernetes.io/projected/41abfa9f-2f52-410c-858f-97cc71b3a8c0-kube-api-access-pfms2\") pod \"root-account-create-update-8m58f\" (UID: 
\"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.536327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41abfa9f-2f52-410c-858f-97cc71b3a8c0-operator-scripts\") pod \"root-account-create-update-8m58f\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.556598 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfms2\" (UniqueName: \"kubernetes.io/projected/41abfa9f-2f52-410c-858f-97cc71b3a8c0-kube-api-access-pfms2\") pod \"root-account-create-update-8m58f\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:05 crc kubenswrapper[4921]: I0318 13:31:05.646755 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:06 crc kubenswrapper[4921]: I0318 13:31:06.088183 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8m58f"] Mar 18 13:31:06 crc kubenswrapper[4921]: I0318 13:31:06.361548 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8m58f" event={"ID":"41abfa9f-2f52-410c-858f-97cc71b3a8c0","Type":"ContainerStarted","Data":"999a001bf1cf2790431cf4fa3c63db2079cec4287c15b298e6db0c8dcd50acf8"} Mar 18 13:31:06 crc kubenswrapper[4921]: I0318 13:31:06.361976 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8m58f" event={"ID":"41abfa9f-2f52-410c-858f-97cc71b3a8c0","Type":"ContainerStarted","Data":"f965aadfa50338f47048c3c430c1cd35d58f78e57d5184ac379b4c96fad53c5a"} Mar 18 13:31:06 crc kubenswrapper[4921]: I0318 13:31:06.379173 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-8m58f" podStartSLOduration=1.3791499759999999 podStartE2EDuration="1.379149976s" podCreationTimestamp="2026-03-18 13:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:31:06.375582094 +0000 UTC m=+4885.925502743" watchObservedRunningTime="2026-03-18 13:31:06.379149976 +0000 UTC m=+4885.929070615" Mar 18 13:31:07 crc kubenswrapper[4921]: I0318 13:31:07.181041 4921 scope.go:117] "RemoveContainer" containerID="1c0c6193aab0386358ee7e55c2dd157892b4ee764993642fbd43d3ea83d8156d" Mar 18 13:31:07 crc kubenswrapper[4921]: I0318 13:31:07.371877 4921 generic.go:334] "Generic (PLEG): container finished" podID="41abfa9f-2f52-410c-858f-97cc71b3a8c0" containerID="999a001bf1cf2790431cf4fa3c63db2079cec4287c15b298e6db0c8dcd50acf8" exitCode=0 Mar 18 13:31:07 crc kubenswrapper[4921]: I0318 13:31:07.371934 4921 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/root-account-create-update-8m58f" event={"ID":"41abfa9f-2f52-410c-858f-97cc71b3a8c0","Type":"ContainerDied","Data":"999a001bf1cf2790431cf4fa3c63db2079cec4287c15b298e6db0c8dcd50acf8"} Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.379598 4921 generic.go:334] "Generic (PLEG): container finished" podID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerID="e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa" exitCode=0 Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.379677 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d66395fa-b434-4b08-ae7f-d061fd4fd559","Type":"ContainerDied","Data":"e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa"} Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.382551 4921 generic.go:334] "Generic (PLEG): container finished" podID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerID="244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70" exitCode=0 Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.382638 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e32df9bf-c9fd-447c-ac1c-b13028f2e12d","Type":"ContainerDied","Data":"244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70"} Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.648283 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.789763 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41abfa9f-2f52-410c-858f-97cc71b3a8c0-operator-scripts\") pod \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.790003 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfms2\" (UniqueName: \"kubernetes.io/projected/41abfa9f-2f52-410c-858f-97cc71b3a8c0-kube-api-access-pfms2\") pod \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\" (UID: \"41abfa9f-2f52-410c-858f-97cc71b3a8c0\") " Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.790657 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41abfa9f-2f52-410c-858f-97cc71b3a8c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41abfa9f-2f52-410c-858f-97cc71b3a8c0" (UID: "41abfa9f-2f52-410c-858f-97cc71b3a8c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.795030 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41abfa9f-2f52-410c-858f-97cc71b3a8c0-kube-api-access-pfms2" (OuterVolumeSpecName: "kube-api-access-pfms2") pod "41abfa9f-2f52-410c-858f-97cc71b3a8c0" (UID: "41abfa9f-2f52-410c-858f-97cc71b3a8c0"). InnerVolumeSpecName "kube-api-access-pfms2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.895566 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfms2\" (UniqueName: \"kubernetes.io/projected/41abfa9f-2f52-410c-858f-97cc71b3a8c0-kube-api-access-pfms2\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:08 crc kubenswrapper[4921]: I0318 13:31:08.895622 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41abfa9f-2f52-410c-858f-97cc71b3a8c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.394612 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8m58f" event={"ID":"41abfa9f-2f52-410c-858f-97cc71b3a8c0","Type":"ContainerDied","Data":"f965aadfa50338f47048c3c430c1cd35d58f78e57d5184ac379b4c96fad53c5a"} Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.395032 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f965aadfa50338f47048c3c430c1cd35d58f78e57d5184ac379b4c96fad53c5a" Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.394623 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8m58f" Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.396539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d66395fa-b434-4b08-ae7f-d061fd4fd559","Type":"ContainerStarted","Data":"e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449"} Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.396834 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.398939 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e32df9bf-c9fd-447c-ac1c-b13028f2e12d","Type":"ContainerStarted","Data":"8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860"} Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.399200 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.446941 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.446903824 podStartE2EDuration="37.446903824s" podCreationTimestamp="2026-03-18 13:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:31:09.442084386 +0000 UTC m=+4888.992005055" watchObservedRunningTime="2026-03-18 13:31:09.446903824 +0000 UTC m=+4888.996824463" Mar 18 13:31:09 crc kubenswrapper[4921]: I0318 13:31:09.480247 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.480214647 podStartE2EDuration="37.480214647s" podCreationTimestamp="2026-03-18 13:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 13:31:09.474993498 +0000 UTC m=+4889.024914137" watchObservedRunningTime="2026-03-18 13:31:09.480214647 +0000 UTC m=+4889.030135286" Mar 18 13:31:23 crc kubenswrapper[4921]: I0318 13:31:23.950242 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 13:31:24 crc kubenswrapper[4921]: I0318 13:31:24.045184 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:27 crc kubenswrapper[4921]: I0318 13:31:27.957269 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dwm22"] Mar 18 13:31:27 crc kubenswrapper[4921]: E0318 13:31:27.958391 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41abfa9f-2f52-410c-858f-97cc71b3a8c0" containerName="mariadb-account-create-update" Mar 18 13:31:27 crc kubenswrapper[4921]: I0318 13:31:27.958419 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="41abfa9f-2f52-410c-858f-97cc71b3a8c0" containerName="mariadb-account-create-update" Mar 18 13:31:27 crc kubenswrapper[4921]: I0318 13:31:27.958597 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="41abfa9f-2f52-410c-858f-97cc71b3a8c0" containerName="mariadb-account-create-update" Mar 18 13:31:27 crc kubenswrapper[4921]: I0318 13:31:27.959608 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:27 crc kubenswrapper[4921]: I0318 13:31:27.970243 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dwm22"] Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.097674 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrlpw\" (UniqueName: \"kubernetes.io/projected/127e0d98-4617-4732-88a5-808f6436511d-kube-api-access-jrlpw\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.098287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-config\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.098495 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.200546 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.200660 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrlpw\" (UniqueName: 
\"kubernetes.io/projected/127e0d98-4617-4732-88a5-808f6436511d-kube-api-access-jrlpw\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.200692 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-config\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.201806 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.202281 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-config\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.225150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrlpw\" (UniqueName: \"kubernetes.io/projected/127e0d98-4617-4732-88a5-808f6436511d-kube-api-access-jrlpw\") pod \"dnsmasq-dns-5b7946d7b9-dwm22\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.284793 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.710938 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dwm22"] Mar 18 13:31:28 crc kubenswrapper[4921]: I0318 13:31:28.848959 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:31:29 crc kubenswrapper[4921]: I0318 13:31:29.545284 4921 generic.go:334] "Generic (PLEG): container finished" podID="127e0d98-4617-4732-88a5-808f6436511d" containerID="e4d92b0e2461a07eb022a09429ebe31489e94675bcc91be4d67e861e997f88a5" exitCode=0 Mar 18 13:31:29 crc kubenswrapper[4921]: I0318 13:31:29.545336 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" event={"ID":"127e0d98-4617-4732-88a5-808f6436511d","Type":"ContainerDied","Data":"e4d92b0e2461a07eb022a09429ebe31489e94675bcc91be4d67e861e997f88a5"} Mar 18 13:31:29 crc kubenswrapper[4921]: I0318 13:31:29.545536 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" event={"ID":"127e0d98-4617-4732-88a5-808f6436511d","Type":"ContainerStarted","Data":"aa8e52c7ff55181041819ed101b2c12d822cb5414f7c6cf6863277e047a58d44"} Mar 18 13:31:29 crc kubenswrapper[4921]: I0318 13:31:29.804133 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 13:31:30 crc kubenswrapper[4921]: I0318 13:31:30.558428 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" event={"ID":"127e0d98-4617-4732-88a5-808f6436511d","Type":"ContainerStarted","Data":"7724fbe91b20cafc1927234f3f92193f3fd6024243b4839596a1bb2b704fc489"} Mar 18 13:31:30 crc kubenswrapper[4921]: I0318 13:31:30.559707 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:30 crc kubenswrapper[4921]: I0318 13:31:30.582057 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" podStartSLOduration=3.5820366249999998 podStartE2EDuration="3.582036625s" podCreationTimestamp="2026-03-18 13:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:31:30.577382722 +0000 UTC m=+4910.127303361" watchObservedRunningTime="2026-03-18 13:31:30.582036625 +0000 UTC m=+4910.131957264" Mar 18 13:31:30 crc kubenswrapper[4921]: I0318 13:31:30.698456 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="rabbitmq" containerID="cri-o://e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449" gracePeriod=604799 Mar 18 13:31:31 crc kubenswrapper[4921]: I0318 13:31:31.685290 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="rabbitmq" containerID="cri-o://8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860" gracePeriod=604799 Mar 18 13:31:33 crc kubenswrapper[4921]: I0318 13:31:33.948644 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.9:5672: connect: connection refused" Mar 18 13:31:34 crc kubenswrapper[4921]: I0318 13:31:34.030903 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.10:5672: connect: connection refused" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.260775 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436293 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-erlang-cookie\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436386 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-plugins-conf\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436420 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d66395fa-b434-4b08-ae7f-d061fd4fd559-pod-info\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436451 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76mg9\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-kube-api-access-76mg9\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436477 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-confd\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436611 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436635 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-plugins\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436679 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d66395fa-b434-4b08-ae7f-d061fd4fd559-erlang-cookie-secret\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.436700 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-server-conf\") pod \"d66395fa-b434-4b08-ae7f-d061fd4fd559\" (UID: \"d66395fa-b434-4b08-ae7f-d061fd4fd559\") " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.438523 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.438803 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.439185 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.443325 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-kube-api-access-76mg9" (OuterVolumeSpecName: "kube-api-access-76mg9") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "kube-api-access-76mg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.443803 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d66395fa-b434-4b08-ae7f-d061fd4fd559-pod-info" (OuterVolumeSpecName: "pod-info") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.452250 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66395fa-b434-4b08-ae7f-d061fd4fd559-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.470184 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199" (OuterVolumeSpecName: "persistence") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "pvc-33325264-6305-4ac3-ad76-de9730093199". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.470303 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-server-conf" (OuterVolumeSpecName: "server-conf") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.529703 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d66395fa-b434-4b08-ae7f-d061fd4fd559" (UID: "d66395fa-b434-4b08-ae7f-d061fd4fd559"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538687 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538755 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") on node \"crc\" " Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538773 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538789 4921 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d66395fa-b434-4b08-ae7f-d061fd4fd559-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538799 4921 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538808 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d66395fa-b434-4b08-ae7f-d061fd4fd559-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538817 4921 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d66395fa-b434-4b08-ae7f-d061fd4fd559-plugins-conf\") on node \"crc\" DevicePath 
\"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538825 4921 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d66395fa-b434-4b08-ae7f-d061fd4fd559-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.538835 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76mg9\" (UniqueName: \"kubernetes.io/projected/d66395fa-b434-4b08-ae7f-d061fd4fd559-kube-api-access-76mg9\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.555838 4921 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.556081 4921 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-33325264-6305-4ac3-ad76-de9730093199" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199") on node "crc" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.608820 4921 generic.go:334] "Generic (PLEG): container finished" podID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerID="e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449" exitCode=0 Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.608887 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d66395fa-b434-4b08-ae7f-d061fd4fd559","Type":"ContainerDied","Data":"e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449"} Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.608933 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d66395fa-b434-4b08-ae7f-d061fd4fd559","Type":"ContainerDied","Data":"d52f8688d964922a484d83e265947bd859c388f3436cc677fff11cc06259df07"} Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.608956 4921 scope.go:117] "RemoveContainer" 
containerID="e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.608971 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.629000 4921 scope.go:117] "RemoveContainer" containerID="e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.641039 4921 reconciler_common.go:293] "Volume detached for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") on node \"crc\" DevicePath \"\"" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.653590 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.656268 4921 scope.go:117] "RemoveContainer" containerID="e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449" Mar 18 13:31:37 crc kubenswrapper[4921]: E0318 13:31:37.656705 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449\": container with ID starting with e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449 not found: ID does not exist" containerID="e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.656737 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449"} err="failed to get container status \"e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449\": rpc error: code = NotFound desc = could not find container 
\"e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449\": container with ID starting with e5a9566e91d0b69803433aef9adebfe229c1a0f478d826880f8a25f39d4d2449 not found: ID does not exist" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.656758 4921 scope.go:117] "RemoveContainer" containerID="e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa" Mar 18 13:31:37 crc kubenswrapper[4921]: E0318 13:31:37.657004 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa\": container with ID starting with e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa not found: ID does not exist" containerID="e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.657037 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa"} err="failed to get container status \"e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa\": rpc error: code = NotFound desc = could not find container \"e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa\": container with ID starting with e5ce31aac25001a4aeeb0b3ec9172612292713eaeee5fa1a150992c16e4d06aa not found: ID does not exist" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.662602 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.678567 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:31:37 crc kubenswrapper[4921]: E0318 13:31:37.678901 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="rabbitmq" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.678926 
4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="rabbitmq" Mar 18 13:31:37 crc kubenswrapper[4921]: E0318 13:31:37.678947 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="setup-container" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.678956 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="setup-container" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.679201 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" containerName="rabbitmq" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.680220 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.682158 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.682364 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.682828 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.683035 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tmcsw" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.683222 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.694226 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845039 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1565cc76-b9f3-4dbd-b130-bcd4096db6da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845144 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1565cc76-b9f3-4dbd-b130-bcd4096db6da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845170 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1565cc76-b9f3-4dbd-b130-bcd4096db6da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845195 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fq56\" (UniqueName: \"kubernetes.io/projected/1565cc76-b9f3-4dbd-b130-bcd4096db6da-kube-api-access-5fq56\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845224 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845257 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845319 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.845351 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1565cc76-b9f3-4dbd-b130-bcd4096db6da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.947039 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1565cc76-b9f3-4dbd-b130-bcd4096db6da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.947508 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.947621 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1565cc76-b9f3-4dbd-b130-bcd4096db6da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.947727 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1565cc76-b9f3-4dbd-b130-bcd4096db6da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.947818 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1565cc76-b9f3-4dbd-b130-bcd4096db6da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.947916 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fq56\" (UniqueName: \"kubernetes.io/projected/1565cc76-b9f3-4dbd-b130-bcd4096db6da-kube-api-access-5fq56\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.948204 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " 
pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.948663 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.949025 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.949225 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.949074 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1565cc76-b9f3-4dbd-b130-bcd4096db6da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.949594 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1565cc76-b9f3-4dbd-b130-bcd4096db6da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.949813 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.952765 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1565cc76-b9f3-4dbd-b130-bcd4096db6da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.953138 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1565cc76-b9f3-4dbd-b130-bcd4096db6da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.959356 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.959392 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ece1b5101c10ac68a0b85cec9466746f312c5b8b54289739ca02c967f81234e/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.963210 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1565cc76-b9f3-4dbd-b130-bcd4096db6da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:37 crc kubenswrapper[4921]: I0318 13:31:37.966171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fq56\" (UniqueName: \"kubernetes.io/projected/1565cc76-b9f3-4dbd-b130-bcd4096db6da-kube-api-access-5fq56\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.001311 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-33325264-6305-4ac3-ad76-de9730093199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33325264-6305-4ac3-ad76-de9730093199\") pod \"rabbitmq-server-0\" (UID: \"1565cc76-b9f3-4dbd-b130-bcd4096db6da\") " pod="openstack/rabbitmq-server-0" Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.225972 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.287357 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.303015 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.348974 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-6jk6h"] Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.349270 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerName="dnsmasq-dns" containerID="cri-o://0ea3e4113e3de356fc4f546636e220a227e9c083ab51cf12f10226cf602b69cc" gracePeriod=10 Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.361936 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362028 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfkjq\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-kube-api-access-kfkjq\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") " Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362092 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-plugins\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: 
\"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362145 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-erlang-cookie\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362177 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-server-conf\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362228 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-erlang-cookie-secret\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362255 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-confd\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362298 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-pod-info\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.362403 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-plugins-conf\") pod \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\" (UID: \"e32df9bf-c9fd-447c-ac1c-b13028f2e12d\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.369837 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.370941 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.371665 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-pod-info" (OuterVolumeSpecName: "pod-info") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.371759 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.372070 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.390629 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-kube-api-access-kfkjq" (OuterVolumeSpecName: "kube-api-access-kfkjq") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "kube-api-access-kfkjq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.396878 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-server-conf" (OuterVolumeSpecName: "server-conf") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.405078 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2" (OuterVolumeSpecName: "persistence") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "pvc-48262dea-eb16-4838-b670-e52aa07d13f2". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.471099 4921 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472218 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") on node \"crc\" "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472240 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfkjq\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-kube-api-access-kfkjq\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472257 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472270 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472284 4921 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-server-conf\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472297 4921 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.472308 4921 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-pod-info\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.486472 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e32df9bf-c9fd-447c-ac1c-b13028f2e12d" (UID: "e32df9bf-c9fd-447c-ac1c-b13028f2e12d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.500755 4921 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.500985 4921 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-48262dea-eb16-4838-b670-e52aa07d13f2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2") on node "crc"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.575875 4921 reconciler_common.go:293] "Volume detached for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.575944 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e32df9bf-c9fd-447c-ac1c-b13028f2e12d-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.622593 4921 generic.go:334] "Generic (PLEG): container finished" podID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerID="8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860" exitCode=0
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.622668 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e32df9bf-c9fd-447c-ac1c-b13028f2e12d","Type":"ContainerDied","Data":"8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860"}
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.622677 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.622698 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e32df9bf-c9fd-447c-ac1c-b13028f2e12d","Type":"ContainerDied","Data":"41eaa1cca6163a8361465a9b4a12478ff3c517170a43426132e59d7841a44d00"}
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.622716 4921 scope.go:117] "RemoveContainer" containerID="8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.627285 4921 generic.go:334] "Generic (PLEG): container finished" podID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerID="0ea3e4113e3de356fc4f546636e220a227e9c083ab51cf12f10226cf602b69cc" exitCode=0
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.627321 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" event={"ID":"c90674ee-98de-43c5-9346-e4a0d002bcfc","Type":"ContainerDied","Data":"0ea3e4113e3de356fc4f546636e220a227e9c083ab51cf12f10226cf602b69cc"}
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.655442 4921 scope.go:117] "RemoveContainer" containerID="244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.677649 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.700071 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.707883 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 13:31:38 crc kubenswrapper[4921]: E0318 13:31:38.708250 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="setup-container"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.708263 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="setup-container"
Mar 18 13:31:38 crc kubenswrapper[4921]: E0318 13:31:38.708279 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="rabbitmq"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.708286 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="rabbitmq"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.708429 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" containerName="rabbitmq"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.709227 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.719941 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.720625 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.723910 4921 scope.go:117] "RemoveContainer" containerID="8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.725036 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.725134 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 18 13:31:38 crc kubenswrapper[4921]: E0318 13:31:38.726509 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860\": container with ID starting with 8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860 not found: ID does not exist" containerID="8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.726792 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860"} err="failed to get container status \"8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860\": rpc error: code = NotFound desc = could not find container \"8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860\": container with ID starting with 8c1784a04d316690fffd1be7e2bc33020c645f0c635c0a09157f279ccff37860 not found: ID does not exist"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.727036 4921 scope.go:117] "RemoveContainer" containerID="244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70"
Mar 18 13:31:38 crc kubenswrapper[4921]: E0318 13:31:38.728585 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70\": container with ID starting with 244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70 not found: ID does not exist" containerID="244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.728910 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70"} err="failed to get container status \"244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70\": rpc error: code = NotFound desc = could not find container \"244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70\": container with ID starting with 244bfc100cd9293da5ddf492831a6e0cac77b3b2649531fb0707148e29dc8a70 not found: ID does not exist"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.740489 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ldptp"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.746204 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.854335 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.864095 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.885723 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/750e5280-bd0c-45da-92c4-4f420995780d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.887971 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncptv\" (UniqueName: \"kubernetes.io/projected/750e5280-bd0c-45da-92c4-4f420995780d-kube-api-access-ncptv\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.888040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.888138 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/750e5280-bd0c-45da-92c4-4f420995780d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.888260 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/750e5280-bd0c-45da-92c4-4f420995780d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.888299 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.890294 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.890380 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.890451 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/750e5280-bd0c-45da-92c4-4f420995780d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992052 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-config\") pod \"c90674ee-98de-43c5-9346-e4a0d002bcfc\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992178 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-dns-svc\") pod \"c90674ee-98de-43c5-9346-e4a0d002bcfc\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992205 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk5pr\" (UniqueName: \"kubernetes.io/projected/c90674ee-98de-43c5-9346-e4a0d002bcfc-kube-api-access-rk5pr\") pod \"c90674ee-98de-43c5-9346-e4a0d002bcfc\" (UID: \"c90674ee-98de-43c5-9346-e4a0d002bcfc\") "
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992399 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992449 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/750e5280-bd0c-45da-92c4-4f420995780d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992481 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/750e5280-bd0c-45da-92c4-4f420995780d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992501 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncptv\" (UniqueName: \"kubernetes.io/projected/750e5280-bd0c-45da-92c4-4f420995780d-kube-api-access-ncptv\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992547 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/750e5280-bd0c-45da-92c4-4f420995780d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992582 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/750e5280-bd0c-45da-92c4-4f420995780d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992600 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.992635 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.993738 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.993952 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/750e5280-bd0c-45da-92c4-4f420995780d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.994245 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/750e5280-bd0c-45da-92c4-4f420995780d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.994900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.995863 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 13:31:38 crc kubenswrapper[4921]: I0318 13:31:38.995904 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/86d5c7f1f45d520f6f366c0ada6bb6fa4bdfe9bcfe5bf864278546983f3060d6/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.007185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/750e5280-bd0c-45da-92c4-4f420995780d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.009847 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/750e5280-bd0c-45da-92c4-4f420995780d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.012380 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/750e5280-bd0c-45da-92c4-4f420995780d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.014176 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncptv\" (UniqueName: \"kubernetes.io/projected/750e5280-bd0c-45da-92c4-4f420995780d-kube-api-access-ncptv\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.014835 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90674ee-98de-43c5-9346-e4a0d002bcfc-kube-api-access-rk5pr" (OuterVolumeSpecName: "kube-api-access-rk5pr") pod "c90674ee-98de-43c5-9346-e4a0d002bcfc" (UID: "c90674ee-98de-43c5-9346-e4a0d002bcfc"). InnerVolumeSpecName "kube-api-access-rk5pr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.038054 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c90674ee-98de-43c5-9346-e4a0d002bcfc" (UID: "c90674ee-98de-43c5-9346-e4a0d002bcfc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.039803 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-config" (OuterVolumeSpecName: "config") pod "c90674ee-98de-43c5-9346-e4a0d002bcfc" (UID: "c90674ee-98de-43c5-9346-e4a0d002bcfc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.049857 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-48262dea-eb16-4838-b670-e52aa07d13f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-48262dea-eb16-4838-b670-e52aa07d13f2\") pod \"rabbitmq-cell1-server-0\" (UID: \"750e5280-bd0c-45da-92c4-4f420995780d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.094040 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-config\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.094079 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c90674ee-98de-43c5-9346-e4a0d002bcfc-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.094091 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk5pr\" (UniqueName: \"kubernetes.io/projected/c90674ee-98de-43c5-9346-e4a0d002bcfc-kube-api-access-rk5pr\") on node \"crc\" DevicePath \"\""
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.219201 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66395fa-b434-4b08-ae7f-d061fd4fd559" path="/var/lib/kubelet/pods/d66395fa-b434-4b08-ae7f-d061fd4fd559/volumes"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.220125 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e32df9bf-c9fd-447c-ac1c-b13028f2e12d" path="/var/lib/kubelet/pods/e32df9bf-c9fd-447c-ac1c-b13028f2e12d/volumes"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.350364 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.637959 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1565cc76-b9f3-4dbd-b130-bcd4096db6da","Type":"ContainerStarted","Data":"487c7ccb2c636ca05d81f27cb90a51664b5a36ac2d6b63c032da3579560a625e"}
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.639939 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h" event={"ID":"c90674ee-98de-43c5-9346-e4a0d002bcfc","Type":"ContainerDied","Data":"e38b2540adfe31883fdcf4f86bf87fdc2e655c5f87653a22cbc7be168ffc1b01"}
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.639989 4921 scope.go:117] "RemoveContainer" containerID="0ea3e4113e3de356fc4f546636e220a227e9c083ab51cf12f10226cf602b69cc"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.639998 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-6jk6h"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.661159 4921 scope.go:117] "RemoveContainer" containerID="494bc219cf6a6f975ab96a4222a5629103e78ce646d1bc6f7e2da8e910b77cf8"
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.663965 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-6jk6h"]
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.669377 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-6jk6h"]
Mar 18 13:31:39 crc kubenswrapper[4921]: I0318 13:31:39.801909 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 18 13:31:40 crc kubenswrapper[4921]: I0318 13:31:40.650561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1565cc76-b9f3-4dbd-b130-bcd4096db6da","Type":"ContainerStarted","Data":"d5eb66d56b55cf43264e5d3ce0fc9c58623d409138d2e9d97799b77b4400db27"}
Mar 18 13:31:40 crc kubenswrapper[4921]: I0318 13:31:40.651822 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"750e5280-bd0c-45da-92c4-4f420995780d","Type":"ContainerStarted","Data":"bcd94454a0b6f2676bc438a0d1f7474ccf263a85ddb042f8a95e404b9e10543c"}
Mar 18 13:31:41 crc kubenswrapper[4921]: I0318 13:31:41.232830 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" path="/var/lib/kubelet/pods/c90674ee-98de-43c5-9346-e4a0d002bcfc/volumes"
Mar 18 13:31:41 crc kubenswrapper[4921]: I0318 13:31:41.661061 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"750e5280-bd0c-45da-92c4-4f420995780d","Type":"ContainerStarted","Data":"6377c838aa0d02436548b781aa164a145cdeccc95489bf861d015277c58bea57"}
Mar 18 13:31:47 crc kubenswrapper[4921]: I0318 13:31:47.081088 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:31:47 crc kubenswrapper[4921]: I0318 13:31:47.082608 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.147839 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564012-dg84n"]
Mar 18 13:32:00 crc kubenswrapper[4921]: E0318 13:32:00.149426 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerName="dnsmasq-dns"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.149450 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerName="dnsmasq-dns"
Mar 18 13:32:00 crc kubenswrapper[4921]: E0318 13:32:00.149486 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerName="init"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.149495 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerName="init"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.149685 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90674ee-98de-43c5-9346-e4a0d002bcfc" containerName="dnsmasq-dns"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.150442 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-dg84n"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.153479 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.153964 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.155517 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.167308 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-dg84n"]
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.247897 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/e0681c9b-038a-4c94-b298-fb82286dc29b-kube-api-access-5nfz7\") pod \"auto-csr-approver-29564012-dg84n\" (UID: \"e0681c9b-038a-4c94-b298-fb82286dc29b\") " pod="openshift-infra/auto-csr-approver-29564012-dg84n"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.348972 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/e0681c9b-038a-4c94-b298-fb82286dc29b-kube-api-access-5nfz7\") pod \"auto-csr-approver-29564012-dg84n\" (UID: \"e0681c9b-038a-4c94-b298-fb82286dc29b\") " pod="openshift-infra/auto-csr-approver-29564012-dg84n"
Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.371835 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/e0681c9b-038a-4c94-b298-fb82286dc29b-kube-api-access-5nfz7\") pod \"auto-csr-approver-29564012-dg84n\" (UID: \"e0681c9b-038a-4c94-b298-fb82286dc29b\") "
pod="openshift-infra/auto-csr-approver-29564012-dg84n" Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.477086 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-dg84n" Mar 18 13:32:00 crc kubenswrapper[4921]: I0318 13:32:00.894129 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-dg84n"] Mar 18 13:32:01 crc kubenswrapper[4921]: I0318 13:32:01.807023 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-dg84n" event={"ID":"e0681c9b-038a-4c94-b298-fb82286dc29b","Type":"ContainerStarted","Data":"0146ac65330166ff7ffbe05f9af2bf7a45655cd7b731294e7133e735affd3e30"} Mar 18 13:32:02 crc kubenswrapper[4921]: I0318 13:32:02.820587 4921 generic.go:334] "Generic (PLEG): container finished" podID="e0681c9b-038a-4c94-b298-fb82286dc29b" containerID="94460333591e51847ec87bdbc357debb0ae15eec39e0163f3d5cecdbd91251d1" exitCode=0 Mar 18 13:32:02 crc kubenswrapper[4921]: I0318 13:32:02.820814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-dg84n" event={"ID":"e0681c9b-038a-4c94-b298-fb82286dc29b","Type":"ContainerDied","Data":"94460333591e51847ec87bdbc357debb0ae15eec39e0163f3d5cecdbd91251d1"} Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.131105 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-dg84n" Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.240764 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/e0681c9b-038a-4c94-b298-fb82286dc29b-kube-api-access-5nfz7\") pod \"e0681c9b-038a-4c94-b298-fb82286dc29b\" (UID: \"e0681c9b-038a-4c94-b298-fb82286dc29b\") " Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.251650 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0681c9b-038a-4c94-b298-fb82286dc29b-kube-api-access-5nfz7" (OuterVolumeSpecName: "kube-api-access-5nfz7") pod "e0681c9b-038a-4c94-b298-fb82286dc29b" (UID: "e0681c9b-038a-4c94-b298-fb82286dc29b"). InnerVolumeSpecName "kube-api-access-5nfz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.342379 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nfz7\" (UniqueName: \"kubernetes.io/projected/e0681c9b-038a-4c94-b298-fb82286dc29b-kube-api-access-5nfz7\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.838016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-dg84n" event={"ID":"e0681c9b-038a-4c94-b298-fb82286dc29b","Type":"ContainerDied","Data":"0146ac65330166ff7ffbe05f9af2bf7a45655cd7b731294e7133e735affd3e30"} Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.838068 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0146ac65330166ff7ffbe05f9af2bf7a45655cd7b731294e7133e735affd3e30" Mar 18 13:32:04 crc kubenswrapper[4921]: I0318 13:32:04.838145 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-dg84n" Mar 18 13:32:05 crc kubenswrapper[4921]: I0318 13:32:05.204412 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-4dwvr"] Mar 18 13:32:05 crc kubenswrapper[4921]: I0318 13:32:05.223075 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-4dwvr"] Mar 18 13:32:07 crc kubenswrapper[4921]: I0318 13:32:07.223415 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34af6292-66b5-4f8d-8b9e-bf5d13acfa99" path="/var/lib/kubelet/pods/34af6292-66b5-4f8d-8b9e-bf5d13acfa99/volumes" Mar 18 13:32:07 crc kubenswrapper[4921]: I0318 13:32:07.265677 4921 scope.go:117] "RemoveContainer" containerID="1ede5f2894b83a5bb9aab68dadb4c0cb0df72ffec8a90be8ef60958bf20b45f3" Mar 18 13:32:12 crc kubenswrapper[4921]: I0318 13:32:12.902775 4921 generic.go:334] "Generic (PLEG): container finished" podID="1565cc76-b9f3-4dbd-b130-bcd4096db6da" containerID="d5eb66d56b55cf43264e5d3ce0fc9c58623d409138d2e9d97799b77b4400db27" exitCode=0 Mar 18 13:32:12 crc kubenswrapper[4921]: I0318 13:32:12.902865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1565cc76-b9f3-4dbd-b130-bcd4096db6da","Type":"ContainerDied","Data":"d5eb66d56b55cf43264e5d3ce0fc9c58623d409138d2e9d97799b77b4400db27"} Mar 18 13:32:13 crc kubenswrapper[4921]: I0318 13:32:13.913817 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1565cc76-b9f3-4dbd-b130-bcd4096db6da","Type":"ContainerStarted","Data":"539061d61d7a0e1d72d6efd4a3d1b5149045101d9ad08679d359ae652b11d370"} Mar 18 13:32:13 crc kubenswrapper[4921]: I0318 13:32:13.914545 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 13:32:13 crc kubenswrapper[4921]: I0318 13:32:13.915819 4921 generic.go:334] "Generic (PLEG): container 
finished" podID="750e5280-bd0c-45da-92c4-4f420995780d" containerID="6377c838aa0d02436548b781aa164a145cdeccc95489bf861d015277c58bea57" exitCode=0 Mar 18 13:32:13 crc kubenswrapper[4921]: I0318 13:32:13.915881 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"750e5280-bd0c-45da-92c4-4f420995780d","Type":"ContainerDied","Data":"6377c838aa0d02436548b781aa164a145cdeccc95489bf861d015277c58bea57"} Mar 18 13:32:13 crc kubenswrapper[4921]: I0318 13:32:13.955569 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.955524707 podStartE2EDuration="36.955524707s" podCreationTimestamp="2026-03-18 13:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:13.942592637 +0000 UTC m=+4953.492513296" watchObservedRunningTime="2026-03-18 13:32:13.955524707 +0000 UTC m=+4953.505445356" Mar 18 13:32:14 crc kubenswrapper[4921]: I0318 13:32:14.932298 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"750e5280-bd0c-45da-92c4-4f420995780d","Type":"ContainerStarted","Data":"106061abd37404c31e6ed988bcf898c7df8fb0e6d59d2c0f491a626966c96324"} Mar 18 13:32:14 crc kubenswrapper[4921]: I0318 13:32:14.933214 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:32:14 crc kubenswrapper[4921]: I0318 13:32:14.953823 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.953788814 podStartE2EDuration="36.953788814s" podCreationTimestamp="2026-03-18 13:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:14.952781155 +0000 UTC m=+4954.502701844" 
watchObservedRunningTime="2026-03-18 13:32:14.953788814 +0000 UTC m=+4954.503709453" Mar 18 13:32:17 crc kubenswrapper[4921]: I0318 13:32:17.081881 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:32:17 crc kubenswrapper[4921]: I0318 13:32:17.081980 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:32:28 crc kubenswrapper[4921]: I0318 13:32:28.306309 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 13:32:29 crc kubenswrapper[4921]: I0318 13:32:29.354156 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.331555 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 13:32:40 crc kubenswrapper[4921]: E0318 13:32:40.332656 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0681c9b-038a-4c94-b298-fb82286dc29b" containerName="oc" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.332675 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0681c9b-038a-4c94-b298-fb82286dc29b" containerName="oc" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.332859 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0681c9b-038a-4c94-b298-fb82286dc29b" containerName="oc" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.333542 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.336278 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jz5tc" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.342253 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.458377 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm5lv\" (UniqueName: \"kubernetes.io/projected/be4311dd-c95f-4abd-9ef0-fbd9eea604cf-kube-api-access-jm5lv\") pod \"mariadb-client\" (UID: \"be4311dd-c95f-4abd-9ef0-fbd9eea604cf\") " pod="openstack/mariadb-client" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.560017 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm5lv\" (UniqueName: \"kubernetes.io/projected/be4311dd-c95f-4abd-9ef0-fbd9eea604cf-kube-api-access-jm5lv\") pod \"mariadb-client\" (UID: \"be4311dd-c95f-4abd-9ef0-fbd9eea604cf\") " pod="openstack/mariadb-client" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.582394 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm5lv\" (UniqueName: \"kubernetes.io/projected/be4311dd-c95f-4abd-9ef0-fbd9eea604cf-kube-api-access-jm5lv\") pod \"mariadb-client\" (UID: \"be4311dd-c95f-4abd-9ef0-fbd9eea604cf\") " pod="openstack/mariadb-client" Mar 18 13:32:40 crc kubenswrapper[4921]: I0318 13:32:40.655527 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 13:32:41 crc kubenswrapper[4921]: I0318 13:32:41.009294 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 13:32:41 crc kubenswrapper[4921]: I0318 13:32:41.123965 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be4311dd-c95f-4abd-9ef0-fbd9eea604cf","Type":"ContainerStarted","Data":"138832ec0a4064cf58c6b4eee51a4c607375f2b7406295731f980bc5a1739f1a"} Mar 18 13:32:42 crc kubenswrapper[4921]: I0318 13:32:42.133164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be4311dd-c95f-4abd-9ef0-fbd9eea604cf","Type":"ContainerStarted","Data":"62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5"} Mar 18 13:32:42 crc kubenswrapper[4921]: I0318 13:32:42.149261 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.149224317 podStartE2EDuration="2.149224317s" podCreationTimestamp="2026-03-18 13:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:32:42.145779759 +0000 UTC m=+4981.695700408" watchObservedRunningTime="2026-03-18 13:32:42.149224317 +0000 UTC m=+4981.699144946" Mar 18 13:32:47 crc kubenswrapper[4921]: I0318 13:32:47.081402 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:32:47 crc kubenswrapper[4921]: I0318 13:32:47.082286 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:32:47 crc kubenswrapper[4921]: I0318 13:32:47.082351 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:32:47 crc kubenswrapper[4921]: I0318 13:32:47.083143 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:32:47 crc kubenswrapper[4921]: I0318 13:32:47.083251 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" gracePeriod=600 Mar 18 13:32:47 crc kubenswrapper[4921]: E0318 13:32:47.206079 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:32:48 crc kubenswrapper[4921]: I0318 13:32:48.177037 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" exitCode=0 Mar 18 13:32:48 crc kubenswrapper[4921]: I0318 13:32:48.177086 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"} Mar 18 13:32:48 crc kubenswrapper[4921]: I0318 13:32:48.177167 4921 scope.go:117] "RemoveContainer" containerID="6801f03ae22e22ea24d6b66923ad809a09a2350a73e361f997823c799fdb1b50" Mar 18 13:32:48 crc kubenswrapper[4921]: I0318 13:32:48.177732 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:32:48 crc kubenswrapper[4921]: E0318 13:32:48.178023 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:32:56 crc kubenswrapper[4921]: I0318 13:32:56.716527 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 13:32:56 crc kubenswrapper[4921]: I0318 13:32:56.717122 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="be4311dd-c95f-4abd-9ef0-fbd9eea604cf" containerName="mariadb-client" containerID="cri-o://62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5" gracePeriod=30 Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.229032 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.248547 4921 generic.go:334] "Generic (PLEG): container finished" podID="be4311dd-c95f-4abd-9ef0-fbd9eea604cf" containerID="62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5" exitCode=143 Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.248583 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.248599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be4311dd-c95f-4abd-9ef0-fbd9eea604cf","Type":"ContainerDied","Data":"62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5"} Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.248984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"be4311dd-c95f-4abd-9ef0-fbd9eea604cf","Type":"ContainerDied","Data":"138832ec0a4064cf58c6b4eee51a4c607375f2b7406295731f980bc5a1739f1a"} Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.249011 4921 scope.go:117] "RemoveContainer" containerID="62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.271227 4921 scope.go:117] "RemoveContainer" containerID="62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5" Mar 18 13:32:57 crc kubenswrapper[4921]: E0318 13:32:57.272189 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5\": container with ID starting with 62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5 not found: ID does not exist" containerID="62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.272239 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5"} err="failed to get container status \"62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5\": rpc error: code = NotFound desc = could not find container \"62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5\": container with ID starting with 62240eaf58afe40f14d6d6e7792a1ffedf619b1cb45691ba93a1fae14e5785d5 not found: ID does not exist" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.327273 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm5lv\" (UniqueName: \"kubernetes.io/projected/be4311dd-c95f-4abd-9ef0-fbd9eea604cf-kube-api-access-jm5lv\") pod \"be4311dd-c95f-4abd-9ef0-fbd9eea604cf\" (UID: \"be4311dd-c95f-4abd-9ef0-fbd9eea604cf\") " Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.332301 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4311dd-c95f-4abd-9ef0-fbd9eea604cf-kube-api-access-jm5lv" (OuterVolumeSpecName: "kube-api-access-jm5lv") pod "be4311dd-c95f-4abd-9ef0-fbd9eea604cf" (UID: "be4311dd-c95f-4abd-9ef0-fbd9eea604cf"). InnerVolumeSpecName "kube-api-access-jm5lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.430372 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm5lv\" (UniqueName: \"kubernetes.io/projected/be4311dd-c95f-4abd-9ef0-fbd9eea604cf-kube-api-access-jm5lv\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.582836 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 13:32:57 crc kubenswrapper[4921]: I0318 13:32:57.589980 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 13:32:59 crc kubenswrapper[4921]: I0318 13:32:59.219343 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4311dd-c95f-4abd-9ef0-fbd9eea604cf" path="/var/lib/kubelet/pods/be4311dd-c95f-4abd-9ef0-fbd9eea604cf/volumes" Mar 18 13:33:02 crc kubenswrapper[4921]: I0318 13:33:02.210595 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:33:02 crc kubenswrapper[4921]: E0318 13:33:02.211483 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:33:14 crc kubenswrapper[4921]: I0318 13:33:14.209052 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:33:14 crc kubenswrapper[4921]: E0318 13:33:14.210038 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:33:25 crc kubenswrapper[4921]: I0318 13:33:25.209664 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:33:25 crc kubenswrapper[4921]: E0318 13:33:25.210694 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:33:39 crc kubenswrapper[4921]: I0318 13:33:39.209194 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:33:39 crc kubenswrapper[4921]: E0318 13:33:39.210255 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.448161 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kd7bl"] Mar 18 13:33:46 crc kubenswrapper[4921]: E0318 13:33:46.449373 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4311dd-c95f-4abd-9ef0-fbd9eea604cf" containerName="mariadb-client" Mar 18 13:33:46 crc 
kubenswrapper[4921]: I0318 13:33:46.449395 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4311dd-c95f-4abd-9ef0-fbd9eea604cf" containerName="mariadb-client" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.449604 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4311dd-c95f-4abd-9ef0-fbd9eea604cf" containerName="mariadb-client" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.450830 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.463819 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-utilities\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.464361 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrjkh\" (UniqueName: \"kubernetes.io/projected/f0b2db3b-c93a-4570-82c0-813f9f45382f-kube-api-access-xrjkh\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.464403 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-catalog-content\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.522647 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kd7bl"] Mar 18 
13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.566097 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-utilities\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.566286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrjkh\" (UniqueName: \"kubernetes.io/projected/f0b2db3b-c93a-4570-82c0-813f9f45382f-kube-api-access-xrjkh\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.566330 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-catalog-content\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.567202 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-utilities\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.567432 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-catalog-content\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 
13:33:46.588976 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrjkh\" (UniqueName: \"kubernetes.io/projected/f0b2db3b-c93a-4570-82c0-813f9f45382f-kube-api-access-xrjkh\") pod \"certified-operators-kd7bl\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:46 crc kubenswrapper[4921]: I0318 13:33:46.770138 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:47 crc kubenswrapper[4921]: I0318 13:33:47.345849 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kd7bl"] Mar 18 13:33:47 crc kubenswrapper[4921]: I0318 13:33:47.975329 4921 generic.go:334] "Generic (PLEG): container finished" podID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerID="c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6" exitCode=0 Mar 18 13:33:47 crc kubenswrapper[4921]: I0318 13:33:47.975380 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerDied","Data":"c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6"} Mar 18 13:33:47 crc kubenswrapper[4921]: I0318 13:33:47.975750 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerStarted","Data":"f24da9fdc04c3e99b667d1ecbe4e937d1f354d4113aecbd689d16545b2afea78"} Mar 18 13:33:47 crc kubenswrapper[4921]: I0318 13:33:47.978056 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:33:48 crc kubenswrapper[4921]: I0318 13:33:48.986382 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" 
event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerStarted","Data":"f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc"} Mar 18 13:33:49 crc kubenswrapper[4921]: I0318 13:33:49.998742 4921 generic.go:334] "Generic (PLEG): container finished" podID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerID="f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc" exitCode=0 Mar 18 13:33:49 crc kubenswrapper[4921]: I0318 13:33:49.998814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerDied","Data":"f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc"} Mar 18 13:33:51 crc kubenswrapper[4921]: I0318 13:33:51.010629 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerStarted","Data":"561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602"} Mar 18 13:33:51 crc kubenswrapper[4921]: I0318 13:33:51.046040 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kd7bl" podStartSLOduration=2.426498363 podStartE2EDuration="5.046024656s" podCreationTimestamp="2026-03-18 13:33:46 +0000 UTC" firstStartedPulling="2026-03-18 13:33:47.977772393 +0000 UTC m=+5047.527693032" lastFinishedPulling="2026-03-18 13:33:50.597298686 +0000 UTC m=+5050.147219325" observedRunningTime="2026-03-18 13:33:51.04125386 +0000 UTC m=+5050.591174519" watchObservedRunningTime="2026-03-18 13:33:51.046024656 +0000 UTC m=+5050.595945295" Mar 18 13:33:54 crc kubenswrapper[4921]: I0318 13:33:54.208699 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:33:54 crc kubenswrapper[4921]: E0318 13:33:54.209176 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:33:56 crc kubenswrapper[4921]: I0318 13:33:56.771083 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:56 crc kubenswrapper[4921]: I0318 13:33:56.771590 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:56 crc kubenswrapper[4921]: I0318 13:33:56.816016 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:57 crc kubenswrapper[4921]: I0318 13:33:57.104452 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:57 crc kubenswrapper[4921]: I0318 13:33:57.157803 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kd7bl"] Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.072272 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kd7bl" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="registry-server" containerID="cri-o://561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602" gracePeriod=2 Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.482489 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.595370 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-catalog-content\") pod \"f0b2db3b-c93a-4570-82c0-813f9f45382f\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.595948 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-utilities\") pod \"f0b2db3b-c93a-4570-82c0-813f9f45382f\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.596045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrjkh\" (UniqueName: \"kubernetes.io/projected/f0b2db3b-c93a-4570-82c0-813f9f45382f-kube-api-access-xrjkh\") pod \"f0b2db3b-c93a-4570-82c0-813f9f45382f\" (UID: \"f0b2db3b-c93a-4570-82c0-813f9f45382f\") " Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.596820 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-utilities" (OuterVolumeSpecName: "utilities") pod "f0b2db3b-c93a-4570-82c0-813f9f45382f" (UID: "f0b2db3b-c93a-4570-82c0-813f9f45382f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.602705 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b2db3b-c93a-4570-82c0-813f9f45382f-kube-api-access-xrjkh" (OuterVolumeSpecName: "kube-api-access-xrjkh") pod "f0b2db3b-c93a-4570-82c0-813f9f45382f" (UID: "f0b2db3b-c93a-4570-82c0-813f9f45382f"). InnerVolumeSpecName "kube-api-access-xrjkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.697968 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrjkh\" (UniqueName: \"kubernetes.io/projected/f0b2db3b-c93a-4570-82c0-813f9f45382f-kube-api-access-xrjkh\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.698008 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:59 crc kubenswrapper[4921]: I0318 13:33:59.967338 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0b2db3b-c93a-4570-82c0-813f9f45382f" (UID: "f0b2db3b-c93a-4570-82c0-813f9f45382f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.003514 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b2db3b-c93a-4570-82c0-813f9f45382f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.082565 4921 generic.go:334] "Generic (PLEG): container finished" podID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerID="561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602" exitCode=0 Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.082605 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kd7bl" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.082613 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerDied","Data":"561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602"} Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.083730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd7bl" event={"ID":"f0b2db3b-c93a-4570-82c0-813f9f45382f","Type":"ContainerDied","Data":"f24da9fdc04c3e99b667d1ecbe4e937d1f354d4113aecbd689d16545b2afea78"} Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.083752 4921 scope.go:117] "RemoveContainer" containerID="561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.120149 4921 scope.go:117] "RemoveContainer" containerID="f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.144037 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kd7bl"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.156211 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kd7bl"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.167550 4921 scope.go:117] "RemoveContainer" containerID="c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.167734 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564014-xjq4t"] Mar 18 13:34:00 crc kubenswrapper[4921]: E0318 13:34:00.168316 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="extract-utilities" Mar 18 13:34:00 
crc kubenswrapper[4921]: I0318 13:34:00.168336 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="extract-utilities" Mar 18 13:34:00 crc kubenswrapper[4921]: E0318 13:34:00.168377 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="registry-server" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.168386 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="registry-server" Mar 18 13:34:00 crc kubenswrapper[4921]: E0318 13:34:00.168403 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="extract-content" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.168414 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="extract-content" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.168623 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" containerName="registry-server" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.169399 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.172528 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.172810 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.175021 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.182746 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-xjq4t"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.209385 4921 scope.go:117] "RemoveContainer" containerID="561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602" Mar 18 13:34:00 crc kubenswrapper[4921]: E0318 13:34:00.210036 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602\": container with ID starting with 561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602 not found: ID does not exist" containerID="561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.210094 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602"} err="failed to get container status \"561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602\": rpc error: code = NotFound desc = could not find container \"561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602\": container with ID starting with 561f54f59e6e0aa8c157183d082754f20dfc2ee206c9635439ba02014f709602 not 
found: ID does not exist" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.210141 4921 scope.go:117] "RemoveContainer" containerID="f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc" Mar 18 13:34:00 crc kubenswrapper[4921]: E0318 13:34:00.210541 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc\": container with ID starting with f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc not found: ID does not exist" containerID="f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.210596 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc"} err="failed to get container status \"f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc\": rpc error: code = NotFound desc = could not find container \"f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc\": container with ID starting with f0701a051f765c0355b46cf662c1ade807195c191213e8402315180b76a9eadc not found: ID does not exist" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.210631 4921 scope.go:117] "RemoveContainer" containerID="c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6" Mar 18 13:34:00 crc kubenswrapper[4921]: E0318 13:34:00.211085 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6\": container with ID starting with c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6 not found: ID does not exist" containerID="c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.211195 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6"} err="failed to get container status \"c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6\": rpc error: code = NotFound desc = could not find container \"c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6\": container with ID starting with c5677ed8fab3ea672cb68e651f9743cd61443da32447212aca7b97aa9b019ef6 not found: ID does not exist" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.278886 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fvxmh"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.280797 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.285965 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvxmh"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.308025 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48fg4\" (UniqueName: \"kubernetes.io/projected/5bb3bba4-e4ce-47af-aec9-495dc8de6cde-kube-api-access-48fg4\") pod \"auto-csr-approver-29564014-xjq4t\" (UID: \"5bb3bba4-e4ce-47af-aec9-495dc8de6cde\") " pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.409949 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcn7\" (UniqueName: \"kubernetes.io/projected/a9993242-7ce4-4b94-9bad-8af21ec1576c-kube-api-access-8lcn7\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.410519 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48fg4\" (UniqueName: \"kubernetes.io/projected/5bb3bba4-e4ce-47af-aec9-495dc8de6cde-kube-api-access-48fg4\") pod \"auto-csr-approver-29564014-xjq4t\" (UID: \"5bb3bba4-e4ce-47af-aec9-495dc8de6cde\") " pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.410578 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-catalog-content\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.410684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-utilities\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.430902 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48fg4\" (UniqueName: \"kubernetes.io/projected/5bb3bba4-e4ce-47af-aec9-495dc8de6cde-kube-api-access-48fg4\") pod \"auto-csr-approver-29564014-xjq4t\" (UID: \"5bb3bba4-e4ce-47af-aec9-495dc8de6cde\") " pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.511813 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-catalog-content\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc 
kubenswrapper[4921]: I0318 13:34:00.512396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-catalog-content\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.512564 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-utilities\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.512630 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcn7\" (UniqueName: \"kubernetes.io/projected/a9993242-7ce4-4b94-9bad-8af21ec1576c-kube-api-access-8lcn7\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.513137 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-utilities\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.525581 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.530353 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcn7\" (UniqueName: \"kubernetes.io/projected/a9993242-7ce4-4b94-9bad-8af21ec1576c-kube-api-access-8lcn7\") pod \"community-operators-fvxmh\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.606338 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.873532 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z2g5n"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.875739 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.901332 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2g5n"] Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.927371 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-catalog-content\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.927860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bs4c\" (UniqueName: \"kubernetes.io/projected/bb569385-979b-4451-b835-ad47df97cd88-kube-api-access-4bs4c\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " 
pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:00 crc kubenswrapper[4921]: I0318 13:34:00.927942 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-utilities\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.029664 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-catalog-content\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.029736 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bs4c\" (UniqueName: \"kubernetes.io/projected/bb569385-979b-4451-b835-ad47df97cd88-kube-api-access-4bs4c\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.029779 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-utilities\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.030352 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-utilities\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " 
pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.030582 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-catalog-content\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.053261 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bs4c\" (UniqueName: \"kubernetes.io/projected/bb569385-979b-4451-b835-ad47df97cd88-kube-api-access-4bs4c\") pod \"redhat-marketplace-z2g5n\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.089959 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-xjq4t"] Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.215944 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.220969 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b2db3b-c93a-4570-82c0-813f9f45382f" path="/var/lib/kubelet/pods/f0b2db3b-c93a-4570-82c0-813f9f45382f/volumes" Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.267665 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvxmh"] Mar 18 13:34:01 crc kubenswrapper[4921]: W0318 13:34:01.269783 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9993242_7ce4_4b94_9bad_8af21ec1576c.slice/crio-8f7295f90123ea809e3870d29188772bc26d161ea85c52ab8a9089c97c499a64 WatchSource:0}: Error finding container 8f7295f90123ea809e3870d29188772bc26d161ea85c52ab8a9089c97c499a64: Status 404 returned error can't find the container with id 8f7295f90123ea809e3870d29188772bc26d161ea85c52ab8a9089c97c499a64 Mar 18 13:34:01 crc kubenswrapper[4921]: W0318 13:34:01.687453 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb569385_979b_4451_b835_ad47df97cd88.slice/crio-47c2021f653f519e5538969a3ac188874547551caeee75c685e60bb4795d915e WatchSource:0}: Error finding container 47c2021f653f519e5538969a3ac188874547551caeee75c685e60bb4795d915e: Status 404 returned error can't find the container with id 47c2021f653f519e5538969a3ac188874547551caeee75c685e60bb4795d915e Mar 18 13:34:01 crc kubenswrapper[4921]: I0318 13:34:01.687521 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2g5n"] Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.107332 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" 
event={"ID":"5bb3bba4-e4ce-47af-aec9-495dc8de6cde","Type":"ContainerStarted","Data":"a4b650dbb20326f3dfca08feca46f7e582436d65814c112e234d488da6aedb1f"} Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.109732 4921 generic.go:334] "Generic (PLEG): container finished" podID="bb569385-979b-4451-b835-ad47df97cd88" containerID="242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178" exitCode=0 Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.109803 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2g5n" event={"ID":"bb569385-979b-4451-b835-ad47df97cd88","Type":"ContainerDied","Data":"242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178"} Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.109820 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2g5n" event={"ID":"bb569385-979b-4451-b835-ad47df97cd88","Type":"ContainerStarted","Data":"47c2021f653f519e5538969a3ac188874547551caeee75c685e60bb4795d915e"} Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.111184 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerID="7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c" exitCode=0 Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.111226 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerDied","Data":"7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c"} Mar 18 13:34:02 crc kubenswrapper[4921]: I0318 13:34:02.111254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerStarted","Data":"8f7295f90123ea809e3870d29188772bc26d161ea85c52ab8a9089c97c499a64"} Mar 18 13:34:03 crc kubenswrapper[4921]: I0318 
13:34:03.126817 4921 generic.go:334] "Generic (PLEG): container finished" podID="5bb3bba4-e4ce-47af-aec9-495dc8de6cde" containerID="cbc676dfaad8711a3030141aec29f84fb4465d674115228622d662c85f6b370d" exitCode=0 Mar 18 13:34:03 crc kubenswrapper[4921]: I0318 13:34:03.126898 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" event={"ID":"5bb3bba4-e4ce-47af-aec9-495dc8de6cde","Type":"ContainerDied","Data":"cbc676dfaad8711a3030141aec29f84fb4465d674115228622d662c85f6b370d"} Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.136422 4921 generic.go:334] "Generic (PLEG): container finished" podID="bb569385-979b-4451-b835-ad47df97cd88" containerID="260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582" exitCode=0 Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.136503 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2g5n" event={"ID":"bb569385-979b-4451-b835-ad47df97cd88","Type":"ContainerDied","Data":"260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582"} Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.139473 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerStarted","Data":"78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755"} Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.777805 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.801019 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48fg4\" (UniqueName: \"kubernetes.io/projected/5bb3bba4-e4ce-47af-aec9-495dc8de6cde-kube-api-access-48fg4\") pod \"5bb3bba4-e4ce-47af-aec9-495dc8de6cde\" (UID: \"5bb3bba4-e4ce-47af-aec9-495dc8de6cde\") " Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.807604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb3bba4-e4ce-47af-aec9-495dc8de6cde-kube-api-access-48fg4" (OuterVolumeSpecName: "kube-api-access-48fg4") pod "5bb3bba4-e4ce-47af-aec9-495dc8de6cde" (UID: "5bb3bba4-e4ce-47af-aec9-495dc8de6cde"). InnerVolumeSpecName "kube-api-access-48fg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:04 crc kubenswrapper[4921]: I0318 13:34:04.903518 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48fg4\" (UniqueName: \"kubernetes.io/projected/5bb3bba4-e4ce-47af-aec9-495dc8de6cde-kube-api-access-48fg4\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.147755 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" event={"ID":"5bb3bba4-e4ce-47af-aec9-495dc8de6cde","Type":"ContainerDied","Data":"a4b650dbb20326f3dfca08feca46f7e582436d65814c112e234d488da6aedb1f"} Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.147782 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-xjq4t" Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.147800 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b650dbb20326f3dfca08feca46f7e582436d65814c112e234d488da6aedb1f" Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.150583 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerID="78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755" exitCode=0 Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.150645 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerDied","Data":"78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755"} Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.850979 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-pchdh"] Mar 18 13:34:05 crc kubenswrapper[4921]: I0318 13:34:05.858238 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-pchdh"] Mar 18 13:34:06 crc kubenswrapper[4921]: I0318 13:34:06.161171 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2g5n" event={"ID":"bb569385-979b-4451-b835-ad47df97cd88","Type":"ContainerStarted","Data":"2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77"} Mar 18 13:34:06 crc kubenswrapper[4921]: I0318 13:34:06.163714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerStarted","Data":"8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705"} Mar 18 13:34:06 crc kubenswrapper[4921]: I0318 13:34:06.182634 4921 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-z2g5n" podStartSLOduration=3.297097237 podStartE2EDuration="6.182612904s" podCreationTimestamp="2026-03-18 13:34:00 +0000 UTC" firstStartedPulling="2026-03-18 13:34:02.111485439 +0000 UTC m=+5061.661406078" lastFinishedPulling="2026-03-18 13:34:04.997001106 +0000 UTC m=+5064.546921745" observedRunningTime="2026-03-18 13:34:06.180243596 +0000 UTC m=+5065.730164255" watchObservedRunningTime="2026-03-18 13:34:06.182612904 +0000 UTC m=+5065.732533543" Mar 18 13:34:06 crc kubenswrapper[4921]: I0318 13:34:06.201476 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fvxmh" podStartSLOduration=2.608493448 podStartE2EDuration="6.201453701s" podCreationTimestamp="2026-03-18 13:34:00 +0000 UTC" firstStartedPulling="2026-03-18 13:34:02.11291129 +0000 UTC m=+5061.662831929" lastFinishedPulling="2026-03-18 13:34:05.705871533 +0000 UTC m=+5065.255792182" observedRunningTime="2026-03-18 13:34:06.20035688 +0000 UTC m=+5065.750277539" watchObservedRunningTime="2026-03-18 13:34:06.201453701 +0000 UTC m=+5065.751374350" Mar 18 13:34:07 crc kubenswrapper[4921]: I0318 13:34:07.217911 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9152eddf-1c76-4e8f-a9a7-edee46f7c8e7" path="/var/lib/kubelet/pods/9152eddf-1c76-4e8f-a9a7-edee46f7c8e7/volumes" Mar 18 13:34:07 crc kubenswrapper[4921]: I0318 13:34:07.402763 4921 scope.go:117] "RemoveContainer" containerID="7f6606f7b853c58e72012a0fbe3a6466b5e4e2f95e2cd52ae1d264e058a92f55" Mar 18 13:34:07 crc kubenswrapper[4921]: I0318 13:34:07.424788 4921 scope.go:117] "RemoveContainer" containerID="8e77ac4016b30e9544bbb440dcfa7597adad767976d604498bf3a6c63197cb84" Mar 18 13:34:08 crc kubenswrapper[4921]: I0318 13:34:08.209382 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:34:08 crc kubenswrapper[4921]: E0318 13:34:08.209867 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:34:10 crc kubenswrapper[4921]: I0318 13:34:10.607726 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:10 crc kubenswrapper[4921]: I0318 13:34:10.608349 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:10 crc kubenswrapper[4921]: I0318 13:34:10.648390 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:11 crc kubenswrapper[4921]: I0318 13:34:11.219887 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:11 crc kubenswrapper[4921]: I0318 13:34:11.219937 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:11 crc kubenswrapper[4921]: I0318 13:34:11.243999 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:11 crc kubenswrapper[4921]: I0318 13:34:11.261513 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:11 crc kubenswrapper[4921]: I0318 13:34:11.299656 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvxmh"] Mar 18 13:34:12 crc kubenswrapper[4921]: I0318 13:34:12.253879 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:12 crc kubenswrapper[4921]: I0318 13:34:12.886949 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2g5n"] Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.215435 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fvxmh" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="registry-server" containerID="cri-o://8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705" gracePeriod=2 Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.623638 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.705207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lcn7\" (UniqueName: \"kubernetes.io/projected/a9993242-7ce4-4b94-9bad-8af21ec1576c-kube-api-access-8lcn7\") pod \"a9993242-7ce4-4b94-9bad-8af21ec1576c\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.705362 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-catalog-content\") pod \"a9993242-7ce4-4b94-9bad-8af21ec1576c\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.705405 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-utilities\") pod \"a9993242-7ce4-4b94-9bad-8af21ec1576c\" (UID: \"a9993242-7ce4-4b94-9bad-8af21ec1576c\") " Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.706435 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-utilities" (OuterVolumeSpecName: "utilities") pod "a9993242-7ce4-4b94-9bad-8af21ec1576c" (UID: "a9993242-7ce4-4b94-9bad-8af21ec1576c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.712421 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9993242-7ce4-4b94-9bad-8af21ec1576c-kube-api-access-8lcn7" (OuterVolumeSpecName: "kube-api-access-8lcn7") pod "a9993242-7ce4-4b94-9bad-8af21ec1576c" (UID: "a9993242-7ce4-4b94-9bad-8af21ec1576c"). InnerVolumeSpecName "kube-api-access-8lcn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.767221 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9993242-7ce4-4b94-9bad-8af21ec1576c" (UID: "a9993242-7ce4-4b94-9bad-8af21ec1576c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.808283 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lcn7\" (UniqueName: \"kubernetes.io/projected/a9993242-7ce4-4b94-9bad-8af21ec1576c-kube-api-access-8lcn7\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.809357 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:13 crc kubenswrapper[4921]: I0318 13:34:13.809410 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9993242-7ce4-4b94-9bad-8af21ec1576c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.226333 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerID="8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705" exitCode=0 Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.226375 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerDied","Data":"8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705"} Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.226434 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvxmh" event={"ID":"a9993242-7ce4-4b94-9bad-8af21ec1576c","Type":"ContainerDied","Data":"8f7295f90123ea809e3870d29188772bc26d161ea85c52ab8a9089c97c499a64"} Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.226433 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fvxmh" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.226487 4921 scope.go:117] "RemoveContainer" containerID="8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.226555 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z2g5n" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="registry-server" containerID="cri-o://2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77" gracePeriod=2 Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.264473 4921 scope.go:117] "RemoveContainer" containerID="78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.269243 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvxmh"] Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.276317 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fvxmh"] Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.380547 4921 scope.go:117] "RemoveContainer" containerID="7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.418222 4921 scope.go:117] "RemoveContainer" containerID="8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705" Mar 18 13:34:14 crc kubenswrapper[4921]: E0318 13:34:14.418850 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705\": container with ID starting with 8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705 not found: ID does not exist" containerID="8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705" Mar 18 13:34:14 crc 
kubenswrapper[4921]: I0318 13:34:14.418894 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705"} err="failed to get container status \"8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705\": rpc error: code = NotFound desc = could not find container \"8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705\": container with ID starting with 8adb61a9062c518708f90bf010c1017cbb32bcd5129cbcd774ac82853b835705 not found: ID does not exist" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.418922 4921 scope.go:117] "RemoveContainer" containerID="78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755" Mar 18 13:34:14 crc kubenswrapper[4921]: E0318 13:34:14.419299 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755\": container with ID starting with 78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755 not found: ID does not exist" containerID="78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.419320 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755"} err="failed to get container status \"78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755\": rpc error: code = NotFound desc = could not find container \"78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755\": container with ID starting with 78dc13ac94d8b06faf2de9e364ed194b354f8465e9f00b33b6201b7dbfef5755 not found: ID does not exist" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.419335 4921 scope.go:117] "RemoveContainer" containerID="7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c" Mar 18 
13:34:14 crc kubenswrapper[4921]: E0318 13:34:14.419689 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c\": container with ID starting with 7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c not found: ID does not exist" containerID="7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.419723 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c"} err="failed to get container status \"7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c\": rpc error: code = NotFound desc = could not find container \"7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c\": container with ID starting with 7dcad9cba393d4fe184e61712f1a2db9e17e6d23a18e5c57e4052807c556ea4c not found: ID does not exist" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.651048 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.824309 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-utilities\") pod \"bb569385-979b-4451-b835-ad47df97cd88\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.824371 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-catalog-content\") pod \"bb569385-979b-4451-b835-ad47df97cd88\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.824500 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bs4c\" (UniqueName: \"kubernetes.io/projected/bb569385-979b-4451-b835-ad47df97cd88-kube-api-access-4bs4c\") pod \"bb569385-979b-4451-b835-ad47df97cd88\" (UID: \"bb569385-979b-4451-b835-ad47df97cd88\") " Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.825618 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-utilities" (OuterVolumeSpecName: "utilities") pod "bb569385-979b-4451-b835-ad47df97cd88" (UID: "bb569385-979b-4451-b835-ad47df97cd88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.827873 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb569385-979b-4451-b835-ad47df97cd88-kube-api-access-4bs4c" (OuterVolumeSpecName: "kube-api-access-4bs4c") pod "bb569385-979b-4451-b835-ad47df97cd88" (UID: "bb569385-979b-4451-b835-ad47df97cd88"). InnerVolumeSpecName "kube-api-access-4bs4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.856991 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb569385-979b-4451-b835-ad47df97cd88" (UID: "bb569385-979b-4451-b835-ad47df97cd88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.925982 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.926017 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb569385-979b-4451-b835-ad47df97cd88-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:14 crc kubenswrapper[4921]: I0318 13:34:14.926028 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bs4c\" (UniqueName: \"kubernetes.io/projected/bb569385-979b-4451-b835-ad47df97cd88-kube-api-access-4bs4c\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.219073 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" path="/var/lib/kubelet/pods/a9993242-7ce4-4b94-9bad-8af21ec1576c/volumes" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.237497 4921 generic.go:334] "Generic (PLEG): container finished" podID="bb569385-979b-4451-b835-ad47df97cd88" containerID="2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77" exitCode=0 Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.237566 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2g5n" 
event={"ID":"bb569385-979b-4451-b835-ad47df97cd88","Type":"ContainerDied","Data":"2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77"} Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.237595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2g5n" event={"ID":"bb569385-979b-4451-b835-ad47df97cd88","Type":"ContainerDied","Data":"47c2021f653f519e5538969a3ac188874547551caeee75c685e60bb4795d915e"} Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.237613 4921 scope.go:117] "RemoveContainer" containerID="2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.237593 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2g5n" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.261514 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2g5n"] Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.262480 4921 scope.go:117] "RemoveContainer" containerID="260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.269056 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2g5n"] Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.281291 4921 scope.go:117] "RemoveContainer" containerID="242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.298334 4921 scope.go:117] "RemoveContainer" containerID="2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77" Mar 18 13:34:15 crc kubenswrapper[4921]: E0318 13:34:15.298936 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77\": container 
with ID starting with 2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77 not found: ID does not exist" containerID="2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.298977 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77"} err="failed to get container status \"2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77\": rpc error: code = NotFound desc = could not find container \"2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77\": container with ID starting with 2bc7b923a0a23df7f556d5a46b13e2f51566b314e441ea9ddd9e951f4c2edb77 not found: ID does not exist" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.299005 4921 scope.go:117] "RemoveContainer" containerID="260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582" Mar 18 13:34:15 crc kubenswrapper[4921]: E0318 13:34:15.299829 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582\": container with ID starting with 260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582 not found: ID does not exist" containerID="260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.299902 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582"} err="failed to get container status \"260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582\": rpc error: code = NotFound desc = could not find container \"260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582\": container with ID starting with 260b09259ba7c9b1d5aa138fecc902bb9bef5df9610532d03261286cea648582 not 
found: ID does not exist" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.299949 4921 scope.go:117] "RemoveContainer" containerID="242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178" Mar 18 13:34:15 crc kubenswrapper[4921]: E0318 13:34:15.300355 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178\": container with ID starting with 242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178 not found: ID does not exist" containerID="242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178" Mar 18 13:34:15 crc kubenswrapper[4921]: I0318 13:34:15.300388 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178"} err="failed to get container status \"242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178\": rpc error: code = NotFound desc = could not find container \"242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178\": container with ID starting with 242ae8ed0bbb228d0276bf6364ced30395e08a8eaa05dd195a09290901b53178 not found: ID does not exist" Mar 18 13:34:17 crc kubenswrapper[4921]: I0318 13:34:17.217796 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb569385-979b-4451-b835-ad47df97cd88" path="/var/lib/kubelet/pods/bb569385-979b-4451-b835-ad47df97cd88/volumes" Mar 18 13:34:20 crc kubenswrapper[4921]: I0318 13:34:20.209900 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:34:20 crc kubenswrapper[4921]: E0318 13:34:20.210336 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:34:33 crc kubenswrapper[4921]: I0318 13:34:33.209283 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:34:33 crc kubenswrapper[4921]: E0318 13:34:33.209876 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:34:47 crc kubenswrapper[4921]: I0318 13:34:47.209022 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:34:47 crc kubenswrapper[4921]: E0318 13:34:47.210501 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.177885 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bptk7"] Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178824 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="registry-server" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178837 
4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="registry-server" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178849 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="extract-utilities" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178856 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="extract-utilities" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178866 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="extract-content" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178873 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="extract-content" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178884 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb3bba4-e4ce-47af-aec9-495dc8de6cde" containerName="oc" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178889 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb3bba4-e4ce-47af-aec9-495dc8de6cde" containerName="oc" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178900 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="extract-utilities" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178906 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="extract-utilities" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178919 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="registry-server" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178925 4921 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="registry-server" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.178941 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="extract-content" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.178948 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="extract-content" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.179100 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb3bba4-e4ce-47af-aec9-495dc8de6cde" containerName="oc" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.179132 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9993242-7ce4-4b94-9bad-8af21ec1576c" containerName="registry-server" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.179178 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb569385-979b-4451-b835-ad47df97cd88" containerName="registry-server" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.180290 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.188905 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bptk7"] Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.209440 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:35:02 crc kubenswrapper[4921]: E0318 13:35:02.209742 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.277488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-utilities\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.277606 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwspz\" (UniqueName: \"kubernetes.io/projected/55358e7c-1149-432e-9b25-8e1f26497130-kube-api-access-dwspz\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.277678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-catalog-content\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.380832 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-utilities\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.380930 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwspz\" (UniqueName: \"kubernetes.io/projected/55358e7c-1149-432e-9b25-8e1f26497130-kube-api-access-dwspz\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.380957 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-catalog-content\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.381431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-utilities\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.381526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-catalog-content\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.398778 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwspz\" (UniqueName: \"kubernetes.io/projected/55358e7c-1149-432e-9b25-8e1f26497130-kube-api-access-dwspz\") pod \"redhat-operators-bptk7\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.504583 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:02 crc kubenswrapper[4921]: I0318 13:35:02.943706 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bptk7"] Mar 18 13:35:03 crc kubenswrapper[4921]: I0318 13:35:03.577964 4921 generic.go:334] "Generic (PLEG): container finished" podID="55358e7c-1149-432e-9b25-8e1f26497130" containerID="23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee" exitCode=0 Mar 18 13:35:03 crc kubenswrapper[4921]: I0318 13:35:03.578017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerDied","Data":"23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee"} Mar 18 13:35:03 crc kubenswrapper[4921]: I0318 13:35:03.578043 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerStarted","Data":"47483d19a12b616973bdb4588118cd451ab2ffd7d02f709a35894433b2247b3c"} Mar 18 13:35:04 crc kubenswrapper[4921]: I0318 13:35:04.588773 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerStarted","Data":"1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9"} Mar 18 13:35:05 crc kubenswrapper[4921]: I0318 13:35:05.599074 4921 generic.go:334] "Generic (PLEG): container finished" podID="55358e7c-1149-432e-9b25-8e1f26497130" containerID="1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9" exitCode=0 Mar 18 13:35:05 crc kubenswrapper[4921]: I0318 13:35:05.599216 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerDied","Data":"1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9"} Mar 18 13:35:06 crc kubenswrapper[4921]: I0318 13:35:06.615979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerStarted","Data":"4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d"} Mar 18 13:35:06 crc kubenswrapper[4921]: I0318 13:35:06.639933 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bptk7" podStartSLOduration=2.137737697 podStartE2EDuration="4.63991086s" podCreationTimestamp="2026-03-18 13:35:02 +0000 UTC" firstStartedPulling="2026-03-18 13:35:03.579422788 +0000 UTC m=+5123.129343427" lastFinishedPulling="2026-03-18 13:35:06.081595941 +0000 UTC m=+5125.631516590" observedRunningTime="2026-03-18 13:35:06.632264042 +0000 UTC m=+5126.182184691" watchObservedRunningTime="2026-03-18 13:35:06.63991086 +0000 UTC m=+5126.189831519" Mar 18 13:35:12 crc kubenswrapper[4921]: I0318 13:35:12.505389 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:12 crc kubenswrapper[4921]: I0318 13:35:12.506215 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:12 crc kubenswrapper[4921]: I0318 13:35:12.550941 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:12 crc kubenswrapper[4921]: I0318 13:35:12.704410 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:12 crc kubenswrapper[4921]: I0318 13:35:12.790040 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bptk7"] Mar 18 13:35:14 crc kubenswrapper[4921]: I0318 13:35:14.688949 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bptk7" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="registry-server" containerID="cri-o://4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d" gracePeriod=2 Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.137220 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.197881 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwspz\" (UniqueName: \"kubernetes.io/projected/55358e7c-1149-432e-9b25-8e1f26497130-kube-api-access-dwspz\") pod \"55358e7c-1149-432e-9b25-8e1f26497130\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.197988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-utilities\") pod \"55358e7c-1149-432e-9b25-8e1f26497130\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.198150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-catalog-content\") pod \"55358e7c-1149-432e-9b25-8e1f26497130\" (UID: \"55358e7c-1149-432e-9b25-8e1f26497130\") " Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.198967 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-utilities" (OuterVolumeSpecName: "utilities") pod "55358e7c-1149-432e-9b25-8e1f26497130" (UID: "55358e7c-1149-432e-9b25-8e1f26497130"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.205930 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55358e7c-1149-432e-9b25-8e1f26497130-kube-api-access-dwspz" (OuterVolumeSpecName: "kube-api-access-dwspz") pod "55358e7c-1149-432e-9b25-8e1f26497130" (UID: "55358e7c-1149-432e-9b25-8e1f26497130"). InnerVolumeSpecName "kube-api-access-dwspz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.300674 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.300813 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwspz\" (UniqueName: \"kubernetes.io/projected/55358e7c-1149-432e-9b25-8e1f26497130-kube-api-access-dwspz\") on node \"crc\" DevicePath \"\"" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.690970 4921 generic.go:334] "Generic (PLEG): container finished" podID="55358e7c-1149-432e-9b25-8e1f26497130" containerID="4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d" exitCode=0 Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.691014 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerDied","Data":"4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d"} Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.691042 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bptk7" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.691062 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bptk7" event={"ID":"55358e7c-1149-432e-9b25-8e1f26497130","Type":"ContainerDied","Data":"47483d19a12b616973bdb4588118cd451ab2ffd7d02f709a35894433b2247b3c"} Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.691084 4921 scope.go:117] "RemoveContainer" containerID="4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.709703 4921 scope.go:117] "RemoveContainer" containerID="1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.748326 4921 scope.go:117] "RemoveContainer" containerID="23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.770030 4921 scope.go:117] "RemoveContainer" containerID="4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d" Mar 18 13:35:15 crc kubenswrapper[4921]: E0318 13:35:15.770562 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d\": container with ID starting with 4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d not found: ID does not exist" containerID="4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.770594 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d"} err="failed to get container status \"4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d\": rpc error: code = NotFound desc = could not find container 
\"4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d\": container with ID starting with 4f5d4182552dc78e18f699bd381c0aa6c638753b773efd24aec0daf6c587905d not found: ID does not exist" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.770621 4921 scope.go:117] "RemoveContainer" containerID="1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9" Mar 18 13:35:15 crc kubenswrapper[4921]: E0318 13:35:15.771055 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9\": container with ID starting with 1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9 not found: ID does not exist" containerID="1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.771082 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9"} err="failed to get container status \"1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9\": rpc error: code = NotFound desc = could not find container \"1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9\": container with ID starting with 1cf3336bc19ed443ffd47caff494d018467b927259071c9f43233ca7c2dc2bd9 not found: ID does not exist" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.771099 4921 scope.go:117] "RemoveContainer" containerID="23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee" Mar 18 13:35:15 crc kubenswrapper[4921]: E0318 13:35:15.771729 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee\": container with ID starting with 23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee not found: ID does not exist" 
containerID="23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee" Mar 18 13:35:15 crc kubenswrapper[4921]: I0318 13:35:15.771757 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee"} err="failed to get container status \"23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee\": rpc error: code = NotFound desc = could not find container \"23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee\": container with ID starting with 23ea0e4d48482b15d390eaa1ee1b0403f06a52eeda7ba18f592e121b2b3b15ee not found: ID does not exist" Mar 18 13:35:16 crc kubenswrapper[4921]: I0318 13:35:16.210532 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:35:16 crc kubenswrapper[4921]: E0318 13:35:16.211523 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:35:17 crc kubenswrapper[4921]: I0318 13:35:17.167750 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55358e7c-1149-432e-9b25-8e1f26497130" (UID: "55358e7c-1149-432e-9b25-8e1f26497130"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:35:17 crc kubenswrapper[4921]: I0318 13:35:17.231067 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55358e7c-1149-432e-9b25-8e1f26497130-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:35:17 crc kubenswrapper[4921]: I0318 13:35:17.239552 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bptk7"] Mar 18 13:35:17 crc kubenswrapper[4921]: I0318 13:35:17.247307 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bptk7"] Mar 18 13:35:19 crc kubenswrapper[4921]: I0318 13:35:19.219799 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55358e7c-1149-432e-9b25-8e1f26497130" path="/var/lib/kubelet/pods/55358e7c-1149-432e-9b25-8e1f26497130/volumes" Mar 18 13:35:27 crc kubenswrapper[4921]: I0318 13:35:27.209361 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:35:27 crc kubenswrapper[4921]: E0318 13:35:27.210617 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:35:42 crc kubenswrapper[4921]: I0318 13:35:42.211370 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:35:42 crc kubenswrapper[4921]: E0318 13:35:42.212315 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:35:56 crc kubenswrapper[4921]: I0318 13:35:56.209611 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:35:56 crc kubenswrapper[4921]: E0318 13:35:56.210497 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.149459 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564016-pdmm5"] Mar 18 13:36:00 crc kubenswrapper[4921]: E0318 13:36:00.150175 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="extract-content" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.150193 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="extract-content" Mar 18 13:36:00 crc kubenswrapper[4921]: E0318 13:36:00.150204 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.150211 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4921]: E0318 13:36:00.150226 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="extract-utilities" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.150234 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="extract-utilities" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.150429 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="55358e7c-1149-432e-9b25-8e1f26497130" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.151030 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.154785 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.155458 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.156493 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.161243 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-pdmm5"] Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.176619 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6ps7\" (UniqueName: \"kubernetes.io/projected/a16d5b6c-3663-4a39-a43f-e3e055a61a0f-kube-api-access-d6ps7\") pod \"auto-csr-approver-29564016-pdmm5\" (UID: \"a16d5b6c-3663-4a39-a43f-e3e055a61a0f\") " pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.278583 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d6ps7\" (UniqueName: \"kubernetes.io/projected/a16d5b6c-3663-4a39-a43f-e3e055a61a0f-kube-api-access-d6ps7\") pod \"auto-csr-approver-29564016-pdmm5\" (UID: \"a16d5b6c-3663-4a39-a43f-e3e055a61a0f\") " pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.299611 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6ps7\" (UniqueName: \"kubernetes.io/projected/a16d5b6c-3663-4a39-a43f-e3e055a61a0f-kube-api-access-d6ps7\") pod \"auto-csr-approver-29564016-pdmm5\" (UID: \"a16d5b6c-3663-4a39-a43f-e3e055a61a0f\") " pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.473150 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:00 crc kubenswrapper[4921]: I0318 13:36:00.908650 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-pdmm5"] Mar 18 13:36:01 crc kubenswrapper[4921]: I0318 13:36:01.084353 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" event={"ID":"a16d5b6c-3663-4a39-a43f-e3e055a61a0f","Type":"ContainerStarted","Data":"f33eb08a32d31355c3a8a2551e9b8c7aab2ce998e76091d74c1befd1c1e71d87"} Mar 18 13:36:03 crc kubenswrapper[4921]: I0318 13:36:03.100924 4921 generic.go:334] "Generic (PLEG): container finished" podID="a16d5b6c-3663-4a39-a43f-e3e055a61a0f" containerID="82a6b2d8e40bbcae7a4fbb44455e662ef41a9baacd819d4a71b64220b3702b3e" exitCode=0 Mar 18 13:36:03 crc kubenswrapper[4921]: I0318 13:36:03.101504 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" event={"ID":"a16d5b6c-3663-4a39-a43f-e3e055a61a0f","Type":"ContainerDied","Data":"82a6b2d8e40bbcae7a4fbb44455e662ef41a9baacd819d4a71b64220b3702b3e"} Mar 18 13:36:04 crc kubenswrapper[4921]: I0318 
13:36:04.384121 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:04 crc kubenswrapper[4921]: I0318 13:36:04.548679 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6ps7\" (UniqueName: \"kubernetes.io/projected/a16d5b6c-3663-4a39-a43f-e3e055a61a0f-kube-api-access-d6ps7\") pod \"a16d5b6c-3663-4a39-a43f-e3e055a61a0f\" (UID: \"a16d5b6c-3663-4a39-a43f-e3e055a61a0f\") " Mar 18 13:36:04 crc kubenswrapper[4921]: I0318 13:36:04.554326 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16d5b6c-3663-4a39-a43f-e3e055a61a0f-kube-api-access-d6ps7" (OuterVolumeSpecName: "kube-api-access-d6ps7") pod "a16d5b6c-3663-4a39-a43f-e3e055a61a0f" (UID: "a16d5b6c-3663-4a39-a43f-e3e055a61a0f"). InnerVolumeSpecName "kube-api-access-d6ps7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:36:04 crc kubenswrapper[4921]: I0318 13:36:04.650663 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6ps7\" (UniqueName: \"kubernetes.io/projected/a16d5b6c-3663-4a39-a43f-e3e055a61a0f-kube-api-access-d6ps7\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:05 crc kubenswrapper[4921]: I0318 13:36:05.119892 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" event={"ID":"a16d5b6c-3663-4a39-a43f-e3e055a61a0f","Type":"ContainerDied","Data":"f33eb08a32d31355c3a8a2551e9b8c7aab2ce998e76091d74c1befd1c1e71d87"} Mar 18 13:36:05 crc kubenswrapper[4921]: I0318 13:36:05.119948 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f33eb08a32d31355c3a8a2551e9b8c7aab2ce998e76091d74c1befd1c1e71d87" Mar 18 13:36:05 crc kubenswrapper[4921]: I0318 13:36:05.120008 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-pdmm5" Mar 18 13:36:05 crc kubenswrapper[4921]: I0318 13:36:05.467628 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-857qb"] Mar 18 13:36:05 crc kubenswrapper[4921]: I0318 13:36:05.473579 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-857qb"] Mar 18 13:36:07 crc kubenswrapper[4921]: I0318 13:36:07.209546 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:36:07 crc kubenswrapper[4921]: E0318 13:36:07.210054 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:36:07 crc kubenswrapper[4921]: I0318 13:36:07.226677 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726e7c6c-3238-4ca6-b386-be62ced029f8" path="/var/lib/kubelet/pods/726e7c6c-3238-4ca6-b386-be62ced029f8/volumes" Mar 18 13:36:07 crc kubenswrapper[4921]: I0318 13:36:07.578195 4921 scope.go:117] "RemoveContainer" containerID="95a31e4dc8fc7181298fa4d06a6762fb24ad4d13ae8e95ad1b8ca74e4fd0dde4" Mar 18 13:36:21 crc kubenswrapper[4921]: I0318 13:36:21.214379 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:36:21 crc kubenswrapper[4921]: E0318 13:36:21.215274 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:36:33 crc kubenswrapper[4921]: I0318 13:36:33.208992 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:36:33 crc kubenswrapper[4921]: E0318 13:36:33.211499 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:36:47 crc kubenswrapper[4921]: I0318 13:36:47.209925 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:36:47 crc kubenswrapper[4921]: E0318 13:36:47.211192 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:36:58 crc kubenswrapper[4921]: I0318 13:36:58.210851 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d" Mar 18 13:36:58 crc kubenswrapper[4921]: E0318 13:36:58.211675 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:37:07 crc kubenswrapper[4921]: I0318 13:37:07.659453 4921 scope.go:117] "RemoveContainer" containerID="04524717c112007438a97dc276099a2b35127aa2a98b868a2d2228311fdfc91a"
Mar 18 13:37:09 crc kubenswrapper[4921]: I0318 13:37:09.209817 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"
Mar 18 13:37:09 crc kubenswrapper[4921]: E0318 13:37:09.211754 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:37:23 crc kubenswrapper[4921]: I0318 13:37:23.208951 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"
Mar 18 13:37:23 crc kubenswrapper[4921]: E0318 13:37:23.209776 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.760900 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Mar 18 13:37:31 crc kubenswrapper[4921]: E0318 13:37:31.763965 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16d5b6c-3663-4a39-a43f-e3e055a61a0f" containerName="oc"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.764010 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16d5b6c-3663-4a39-a43f-e3e055a61a0f" containerName="oc"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.764272 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16d5b6c-3663-4a39-a43f-e3e055a61a0f" containerName="oc"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.764992 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.770813 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jz5tc"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.776981 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.944051 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") " pod="openstack/mariadb-copy-data"
Mar 18 13:37:31 crc kubenswrapper[4921]: I0318 13:37:31.944151 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd8nh\" (UniqueName: \"kubernetes.io/projected/f1fbd8a8-29d9-4209-8f78-5831fdaf6c62-kube-api-access-xd8nh\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") " pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.045602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") " pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.045684 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd8nh\" (UniqueName: \"kubernetes.io/projected/f1fbd8a8-29d9-4209-8f78-5831fdaf6c62-kube-api-access-xd8nh\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") " pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.048909 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.048964 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0bb4e5678ac5d45e307d4ad3bffa762bf2f2abdbb05dcb8f2986d946d76c2b33/globalmount\"" pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.067431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd8nh\" (UniqueName: \"kubernetes.io/projected/f1fbd8a8-29d9-4209-8f78-5831fdaf6c62-kube-api-access-xd8nh\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") " pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.088768 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1243588b-69ce-4c04-8247-85ae6e9e4a55\") pod \"mariadb-copy-data\" (UID: \"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62\") " pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.094495 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.514662 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.851072 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62","Type":"ContainerStarted","Data":"3909a23d87832a1da52ea086d7402b96607ea85a9d72f3466d3fe277fc3326f0"}
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.851131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f1fbd8a8-29d9-4209-8f78-5831fdaf6c62","Type":"ContainerStarted","Data":"0a8c47f99e94e0edbf82210ecc96152f39b44bb58892c81761f00de0a0bf4633"}
Mar 18 13:37:32 crc kubenswrapper[4921]: I0318 13:37:32.873369 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.873348175 podStartE2EDuration="2.873348175s" podCreationTimestamp="2026-03-18 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:37:32.871918404 +0000 UTC m=+5272.421839043" watchObservedRunningTime="2026-03-18 13:37:32.873348175 +0000 UTC m=+5272.423268814"
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.642400 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.644419 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.656840 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.807926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzxg\" (UniqueName: \"kubernetes.io/projected/ff7f2278-ca90-491b-be7e-cc8b1f7183e9-kube-api-access-7fzxg\") pod \"mariadb-client\" (UID: \"ff7f2278-ca90-491b-be7e-cc8b1f7183e9\") " pod="openstack/mariadb-client"
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.910284 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzxg\" (UniqueName: \"kubernetes.io/projected/ff7f2278-ca90-491b-be7e-cc8b1f7183e9-kube-api-access-7fzxg\") pod \"mariadb-client\" (UID: \"ff7f2278-ca90-491b-be7e-cc8b1f7183e9\") " pod="openstack/mariadb-client"
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.934078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzxg\" (UniqueName: \"kubernetes.io/projected/ff7f2278-ca90-491b-be7e-cc8b1f7183e9-kube-api-access-7fzxg\") pod \"mariadb-client\" (UID: \"ff7f2278-ca90-491b-be7e-cc8b1f7183e9\") " pod="openstack/mariadb-client"
Mar 18 13:37:35 crc kubenswrapper[4921]: I0318 13:37:35.967837 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:36 crc kubenswrapper[4921]: I0318 13:37:36.473767 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:36 crc kubenswrapper[4921]: I0318 13:37:36.882160 4921 generic.go:334] "Generic (PLEG): container finished" podID="ff7f2278-ca90-491b-be7e-cc8b1f7183e9" containerID="686b62f7964b4d79ab1b9e641d0d3f6df610ee2bb22718bd2feed005e9688a46" exitCode=0
Mar 18 13:37:36 crc kubenswrapper[4921]: I0318 13:37:36.882207 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ff7f2278-ca90-491b-be7e-cc8b1f7183e9","Type":"ContainerDied","Data":"686b62f7964b4d79ab1b9e641d0d3f6df610ee2bb22718bd2feed005e9688a46"}
Mar 18 13:37:36 crc kubenswrapper[4921]: I0318 13:37:36.882248 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ff7f2278-ca90-491b-be7e-cc8b1f7183e9","Type":"ContainerStarted","Data":"14b7677016419f5cef7f7acd0184700ffa910b6237423a58781ce2c4b51bf695"}
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.195763 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.209482 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"
Mar 18 13:37:38 crc kubenswrapper[4921]: E0318 13:37:38.209835 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.226098 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ff7f2278-ca90-491b-be7e-cc8b1f7183e9/mariadb-client/0.log"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.262946 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.271889 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.349511 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fzxg\" (UniqueName: \"kubernetes.io/projected/ff7f2278-ca90-491b-be7e-cc8b1f7183e9-kube-api-access-7fzxg\") pod \"ff7f2278-ca90-491b-be7e-cc8b1f7183e9\" (UID: \"ff7f2278-ca90-491b-be7e-cc8b1f7183e9\") "
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.362510 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7f2278-ca90-491b-be7e-cc8b1f7183e9-kube-api-access-7fzxg" (OuterVolumeSpecName: "kube-api-access-7fzxg") pod "ff7f2278-ca90-491b-be7e-cc8b1f7183e9" (UID: "ff7f2278-ca90-491b-be7e-cc8b1f7183e9"). InnerVolumeSpecName "kube-api-access-7fzxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.394747 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:38 crc kubenswrapper[4921]: E0318 13:37:38.395375 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7f2278-ca90-491b-be7e-cc8b1f7183e9" containerName="mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.395404 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7f2278-ca90-491b-be7e-cc8b1f7183e9" containerName="mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.395580 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7f2278-ca90-491b-be7e-cc8b1f7183e9" containerName="mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.396198 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.405534 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.452077 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fzxg\" (UniqueName: \"kubernetes.io/projected/ff7f2278-ca90-491b-be7e-cc8b1f7183e9-kube-api-access-7fzxg\") on node \"crc\" DevicePath \"\""
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.553539 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmfw4\" (UniqueName: \"kubernetes.io/projected/652d174a-df37-437a-aaba-81ca303b7260-kube-api-access-xmfw4\") pod \"mariadb-client\" (UID: \"652d174a-df37-437a-aaba-81ca303b7260\") " pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.655255 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmfw4\" (UniqueName: \"kubernetes.io/projected/652d174a-df37-437a-aaba-81ca303b7260-kube-api-access-xmfw4\") pod \"mariadb-client\" (UID: \"652d174a-df37-437a-aaba-81ca303b7260\") " pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.674239 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmfw4\" (UniqueName: \"kubernetes.io/projected/652d174a-df37-437a-aaba-81ca303b7260-kube-api-access-xmfw4\") pod \"mariadb-client\" (UID: \"652d174a-df37-437a-aaba-81ca303b7260\") " pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.723928 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.912434 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b7677016419f5cef7f7acd0184700ffa910b6237423a58781ce2c4b51bf695"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.912520 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.929518 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="ff7f2278-ca90-491b-be7e-cc8b1f7183e9" podUID="652d174a-df37-437a-aaba-81ca303b7260"
Mar 18 13:37:38 crc kubenswrapper[4921]: I0318 13:37:38.970238 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:38 crc kubenswrapper[4921]: W0318 13:37:38.975344 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652d174a_df37_437a_aaba_81ca303b7260.slice/crio-0640c0cf04bf562be1093873ef9bdf9c5dcdbd367d81d87f80ba8443693b683f WatchSource:0}: Error finding container 0640c0cf04bf562be1093873ef9bdf9c5dcdbd367d81d87f80ba8443693b683f: Status 404 returned error can't find the container with id 0640c0cf04bf562be1093873ef9bdf9c5dcdbd367d81d87f80ba8443693b683f
Mar 18 13:37:39 crc kubenswrapper[4921]: I0318 13:37:39.220660 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7f2278-ca90-491b-be7e-cc8b1f7183e9" path="/var/lib/kubelet/pods/ff7f2278-ca90-491b-be7e-cc8b1f7183e9/volumes"
Mar 18 13:37:39 crc kubenswrapper[4921]: I0318 13:37:39.920472 4921 generic.go:334] "Generic (PLEG): container finished" podID="652d174a-df37-437a-aaba-81ca303b7260" containerID="74205b39a361119a37adf1c916397297fe4f1d0a31dd3d1f01334c12bb92c426" exitCode=0
Mar 18 13:37:39 crc kubenswrapper[4921]: I0318 13:37:39.920533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"652d174a-df37-437a-aaba-81ca303b7260","Type":"ContainerDied","Data":"74205b39a361119a37adf1c916397297fe4f1d0a31dd3d1f01334c12bb92c426"}
Mar 18 13:37:39 crc kubenswrapper[4921]: I0318 13:37:39.920577 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"652d174a-df37-437a-aaba-81ca303b7260","Type":"ContainerStarted","Data":"0640c0cf04bf562be1093873ef9bdf9c5dcdbd367d81d87f80ba8443693b683f"}
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.204408 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.230073 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_652d174a-df37-437a-aaba-81ca303b7260/mariadb-client/0.log"
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.258366 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.264297 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.299693 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmfw4\" (UniqueName: \"kubernetes.io/projected/652d174a-df37-437a-aaba-81ca303b7260-kube-api-access-xmfw4\") pod \"652d174a-df37-437a-aaba-81ca303b7260\" (UID: \"652d174a-df37-437a-aaba-81ca303b7260\") "
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.310520 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652d174a-df37-437a-aaba-81ca303b7260-kube-api-access-xmfw4" (OuterVolumeSpecName: "kube-api-access-xmfw4") pod "652d174a-df37-437a-aaba-81ca303b7260" (UID: "652d174a-df37-437a-aaba-81ca303b7260"). InnerVolumeSpecName "kube-api-access-xmfw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.401579 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmfw4\" (UniqueName: \"kubernetes.io/projected/652d174a-df37-437a-aaba-81ca303b7260-kube-api-access-xmfw4\") on node \"crc\" DevicePath \"\""
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.939588 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0640c0cf04bf562be1093873ef9bdf9c5dcdbd367d81d87f80ba8443693b683f"
Mar 18 13:37:41 crc kubenswrapper[4921]: I0318 13:37:41.940143 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 18 13:37:43 crc kubenswrapper[4921]: I0318 13:37:43.219875 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652d174a-df37-437a-aaba-81ca303b7260" path="/var/lib/kubelet/pods/652d174a-df37-437a-aaba-81ca303b7260/volumes"
Mar 18 13:37:50 crc kubenswrapper[4921]: I0318 13:37:50.209227 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"
Mar 18 13:37:51 crc kubenswrapper[4921]: I0318 13:37:51.027577 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"925f41448af3066f1f8956601297f08a71fa65c35e3c953610bfb65d6cea1e9b"}
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.154491 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564018-qln2k"]
Mar 18 13:38:00 crc kubenswrapper[4921]: E0318 13:38:00.157577 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652d174a-df37-437a-aaba-81ca303b7260" containerName="mariadb-client"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.157612 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="652d174a-df37-437a-aaba-81ca303b7260" containerName="mariadb-client"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.157824 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="652d174a-df37-437a-aaba-81ca303b7260" containerName="mariadb-client"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.158656 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.161771 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.161891 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.162673 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.168630 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-qln2k"]
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.249612 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbltq\" (UniqueName: \"kubernetes.io/projected/0b3eea75-7a37-491f-969e-fb42b499ed50-kube-api-access-cbltq\") pod \"auto-csr-approver-29564018-qln2k\" (UID: \"0b3eea75-7a37-491f-969e-fb42b499ed50\") " pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.351358 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbltq\" (UniqueName: \"kubernetes.io/projected/0b3eea75-7a37-491f-969e-fb42b499ed50-kube-api-access-cbltq\") pod \"auto-csr-approver-29564018-qln2k\" (UID: \"0b3eea75-7a37-491f-969e-fb42b499ed50\") " pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.379295 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbltq\" (UniqueName: \"kubernetes.io/projected/0b3eea75-7a37-491f-969e-fb42b499ed50-kube-api-access-cbltq\") pod \"auto-csr-approver-29564018-qln2k\" (UID: \"0b3eea75-7a37-491f-969e-fb42b499ed50\") " pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.479706 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:00 crc kubenswrapper[4921]: I0318 13:38:00.925808 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-qln2k"]
Mar 18 13:38:01 crc kubenswrapper[4921]: I0318 13:38:01.113651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-qln2k" event={"ID":"0b3eea75-7a37-491f-969e-fb42b499ed50","Type":"ContainerStarted","Data":"19e9384afff1ea2f612ad369293abe2e9c346ff9f981de48f347502f607bd262"}
Mar 18 13:38:03 crc kubenswrapper[4921]: I0318 13:38:03.133388 4921 generic.go:334] "Generic (PLEG): container finished" podID="0b3eea75-7a37-491f-969e-fb42b499ed50" containerID="0be02c9084b987d3f4d3b22d032167502c9334c2620a589cbaafe815a0ba47ad" exitCode=0
Mar 18 13:38:03 crc kubenswrapper[4921]: I0318 13:38:03.133449 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-qln2k" event={"ID":"0b3eea75-7a37-491f-969e-fb42b499ed50","Type":"ContainerDied","Data":"0be02c9084b987d3f4d3b22d032167502c9334c2620a589cbaafe815a0ba47ad"}
Mar 18 13:38:04 crc kubenswrapper[4921]: I0318 13:38:04.619690 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:04 crc kubenswrapper[4921]: I0318 13:38:04.730835 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbltq\" (UniqueName: \"kubernetes.io/projected/0b3eea75-7a37-491f-969e-fb42b499ed50-kube-api-access-cbltq\") pod \"0b3eea75-7a37-491f-969e-fb42b499ed50\" (UID: \"0b3eea75-7a37-491f-969e-fb42b499ed50\") "
Mar 18 13:38:04 crc kubenswrapper[4921]: I0318 13:38:04.738573 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3eea75-7a37-491f-969e-fb42b499ed50-kube-api-access-cbltq" (OuterVolumeSpecName: "kube-api-access-cbltq") pod "0b3eea75-7a37-491f-969e-fb42b499ed50" (UID: "0b3eea75-7a37-491f-969e-fb42b499ed50"). InnerVolumeSpecName "kube-api-access-cbltq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:38:04 crc kubenswrapper[4921]: I0318 13:38:04.832910 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbltq\" (UniqueName: \"kubernetes.io/projected/0b3eea75-7a37-491f-969e-fb42b499ed50-kube-api-access-cbltq\") on node \"crc\" DevicePath \"\""
Mar 18 13:38:05 crc kubenswrapper[4921]: I0318 13:38:05.152891 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-qln2k" event={"ID":"0b3eea75-7a37-491f-969e-fb42b499ed50","Type":"ContainerDied","Data":"19e9384afff1ea2f612ad369293abe2e9c346ff9f981de48f347502f607bd262"}
Mar 18 13:38:05 crc kubenswrapper[4921]: I0318 13:38:05.152930 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19e9384afff1ea2f612ad369293abe2e9c346ff9f981de48f347502f607bd262"
Mar 18 13:38:05 crc kubenswrapper[4921]: I0318 13:38:05.152941 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-qln2k"
Mar 18 13:38:05 crc kubenswrapper[4921]: I0318 13:38:05.705989 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-dg84n"]
Mar 18 13:38:05 crc kubenswrapper[4921]: I0318 13:38:05.716725 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-dg84n"]
Mar 18 13:38:07 crc kubenswrapper[4921]: I0318 13:38:07.225925 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0681c9b-038a-4c94-b298-fb82286dc29b" path="/var/lib/kubelet/pods/e0681c9b-038a-4c94-b298-fb82286dc29b/volumes"
Mar 18 13:38:07 crc kubenswrapper[4921]: I0318 13:38:07.716720 4921 scope.go:117] "RemoveContainer" containerID="94460333591e51847ec87bdbc357debb0ae15eec39e0163f3d5cecdbd91251d1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.412528 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 13:38:33 crc kubenswrapper[4921]: E0318 13:38:33.413937 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3eea75-7a37-491f-969e-fb42b499ed50" containerName="oc"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.413960 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3eea75-7a37-491f-969e-fb42b499ed50" containerName="oc"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.416973 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3eea75-7a37-491f-969e-fb42b499ed50" containerName="oc"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.418155 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.423865 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-p6nk2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.429642 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.432076 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.438847 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.439144 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.439188 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.450225 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.452605 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.458712 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.465158 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.582430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf98ae06-916f-4a34-845c-ab36705146c4-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.582834 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgll\" (UniqueName: \"kubernetes.io/projected/cf98ae06-916f-4a34-845c-ab36705146c4-kube-api-access-tlgll\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.582936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5116f2-ee21-4b85-a2fc-2dac3960be81-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.583689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2299f72-ad34-47f5-9d33-14e55ac2b36e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.583820 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2299f72-ad34-47f5-9d33-14e55ac2b36e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.583965 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5116f2-ee21-4b85-a2fc-2dac3960be81-config\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b589087f-aad6-4565-a86d-e9c75608f01a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b589087f-aad6-4565-a86d-e9c75608f01a\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584183 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5496t\" (UniqueName: \"kubernetes.io/projected/ce5116f2-ee21-4b85-a2fc-2dac3960be81-kube-api-access-5496t\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce5116f2-ee21-4b85-a2fc-2dac3960be81-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2299f72-ad34-47f5-9d33-14e55ac2b36e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584530 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf98ae06-916f-4a34-845c-ab36705146c4-config\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1867426-469d-44c4-8313-0d6a530b6b48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1867426-469d-44c4-8313-0d6a530b6b48\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584710 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf98ae06-916f-4a34-845c-ab36705146c4-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584837 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a2bfde31-c335-4472-8606-b426554ecd82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2bfde31-c335-4472-8606-b426554ecd82\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.584943 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf98ae06-916f-4a34-845c-ab36705146c4-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.585018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2299f72-ad34-47f5-9d33-14e55ac2b36e-config\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.585093 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5116f2-ee21-4b85-a2fc-2dac3960be81-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.585213 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpj8\" (UniqueName: \"kubernetes.io/projected/d2299f72-ad34-47f5-9d33-14e55ac2b36e-kube-api-access-zvpj8\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.602900 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.604650 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.607317 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.609435 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-746pj"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.609491 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.616767 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.627030 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.629107 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.635886 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.637904 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.696707 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1867426-469d-44c4-8313-0d6a530b6b48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1867426-469d-44c4-8313-0d6a530b6b48\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.697253 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf98ae06-916f-4a34-845c-ab36705146c4-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.697482 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca0c85e-8f57-406b-a1c8-c439180a084d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698185 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a2bfde31-c335-4472-8606-b426554ecd82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2bfde31-c335-4472-8606-b426554ecd82\") pod \"ovsdbserver-nb-2\" (UID: 
\"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698388 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf98ae06-916f-4a34-845c-ab36705146c4-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698451 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2299f72-ad34-47f5-9d33-14e55ac2b36e-config\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698519 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c88g\" (UniqueName: \"kubernetes.io/projected/4a9d4713-c699-4829-85ca-0aea43794238-kube-api-access-2c88g\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698689 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5116f2-ee21-4b85-a2fc-2dac3960be81-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698746 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpj8\" (UniqueName: \"kubernetes.io/projected/d2299f72-ad34-47f5-9d33-14e55ac2b36e-kube-api-access-zvpj8\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698815 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ca0c85e-8f57-406b-a1c8-c439180a084d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.698890 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a9d4713-c699-4829-85ca-0aea43794238-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.699040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.699152 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ca0c85e-8f57-406b-a1c8-c439180a084d-config\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.708861 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cf98ae06-916f-4a34-845c-ab36705146c4-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709154 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/cf98ae06-916f-4a34-845c-ab36705146c4-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709223 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9d4713-c699-4829-85ca-0aea43794238-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgll\" (UniqueName: \"kubernetes.io/projected/cf98ae06-916f-4a34-845c-ab36705146c4-kube-api-access-tlgll\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709329 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5116f2-ee21-4b85-a2fc-2dac3960be81-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709367 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgw9\" (UniqueName: \"kubernetes.io/projected/0ca0c85e-8f57-406b-a1c8-c439180a084d-kube-api-access-tqgw9\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2299f72-ad34-47f5-9d33-14e55ac2b36e-combined-ca-bundle\") pod 
\"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709473 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2299f72-ad34-47f5-9d33-14e55ac2b36e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709536 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a9d4713-c699-4829-85ca-0aea43794238-config\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709738 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2299f72-ad34-47f5-9d33-14e55ac2b36e-config\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.710403 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5116f2-ee21-4b85-a2fc-2dac3960be81-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.710747 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2299f72-ad34-47f5-9d33-14e55ac2b36e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.710904 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/cf98ae06-916f-4a34-845c-ab36705146c4-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.709607 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ca0c85e-8f57-406b-a1c8-c439180a084d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711276 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a9d4713-c699-4829-85ca-0aea43794238-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711429 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5116f2-ee21-4b85-a2fc-2dac3960be81-config\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711553 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b589087f-aad6-4565-a86d-e9c75608f01a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b589087f-aad6-4565-a86d-e9c75608f01a\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711585 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5496t\" (UniqueName: \"kubernetes.io/projected/ce5116f2-ee21-4b85-a2fc-2dac3960be81-kube-api-access-5496t\") pod 
\"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711684 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce5116f2-ee21-4b85-a2fc-2dac3960be81-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711702 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2299f72-ad34-47f5-9d33-14e55ac2b36e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.711739 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf98ae06-916f-4a34-845c-ab36705146c4-config\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.712537 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf98ae06-916f-4a34-845c-ab36705146c4-config\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.713219 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d2299f72-ad34-47f5-9d33-14e55ac2b36e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.713514 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce5116f2-ee21-4b85-a2fc-2dac3960be81-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.714146 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5116f2-ee21-4b85-a2fc-2dac3960be81-config\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.717604 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.719887 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.719935 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a2bfde31-c335-4472-8606-b426554ecd82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2bfde31-c335-4472-8606-b426554ecd82\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bcef72531a7fad74be3581d70b9f9158940b0d6916a6e34691a4f3dca76e3340/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.720135 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2299f72-ad34-47f5-9d33-14e55ac2b36e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.720143 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf98ae06-916f-4a34-845c-ab36705146c4-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.720467 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.720497 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1867426-469d-44c4-8313-0d6a530b6b48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1867426-469d-44c4-8313-0d6a530b6b48\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/facd462c473ea7c3b80ac0cba5739c83d2769f9f4e299b49c4b507fdaab181f6/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.720469 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.720649 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b589087f-aad6-4565-a86d-e9c75608f01a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b589087f-aad6-4565-a86d-e9c75608f01a\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9eb5cde8bc3bd9671819e9bc52ff8a3a499e5882f7524aec44cd9905072f5a46/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.722785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpj8\" (UniqueName: \"kubernetes.io/projected/d2299f72-ad34-47f5-9d33-14e55ac2b36e-kube-api-access-zvpj8\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.726510 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgll\" (UniqueName: \"kubernetes.io/projected/cf98ae06-916f-4a34-845c-ab36705146c4-kube-api-access-tlgll\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.731616 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5496t\" (UniqueName: \"kubernetes.io/projected/ce5116f2-ee21-4b85-a2fc-2dac3960be81-kube-api-access-5496t\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.732183 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5116f2-ee21-4b85-a2fc-2dac3960be81-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.743854 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.758072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b589087f-aad6-4565-a86d-e9c75608f01a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b589087f-aad6-4565-a86d-e9c75608f01a\") pod \"ovsdbserver-nb-0\" (UID: \"ce5116f2-ee21-4b85-a2fc-2dac3960be81\") " pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.759145 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a2bfde31-c335-4472-8606-b426554ecd82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a2bfde31-c335-4472-8606-b426554ecd82\") pod \"ovsdbserver-nb-2\" (UID: \"cf98ae06-916f-4a34-845c-ab36705146c4\") " pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.766101 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1867426-469d-44c4-8313-0d6a530b6b48\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1867426-469d-44c4-8313-0d6a530b6b48\") pod \"ovsdbserver-nb-1\" (UID: \"d2299f72-ad34-47f5-9d33-14e55ac2b36e\") " pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.798043 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.812870 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813586 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813657 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-config\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813702 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca0c85e-8f57-406b-a1c8-c439180a084d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813767 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c88g\" (UniqueName: \"kubernetes.io/projected/4a9d4713-c699-4829-85ca-0aea43794238-kube-api-access-2c88g\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc 
kubenswrapper[4921]: I0318 13:38:33.813792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ca0c85e-8f57-406b-a1c8-c439180a084d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813812 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a9d4713-c699-4829-85ca-0aea43794238-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813850 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813872 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ca0c85e-8f57-406b-a1c8-c439180a084d-config\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813902 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9d4713-c699-4829-85ca-0aea43794238-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813930 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgw9\" (UniqueName: 
\"kubernetes.io/projected/0ca0c85e-8f57-406b-a1c8-c439180a084d-kube-api-access-tqgw9\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813951 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a9d4713-c699-4829-85ca-0aea43794238-config\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813974 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2mc8\" (UniqueName: \"kubernetes.io/projected/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-kube-api-access-v2mc8\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.813998 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56a8ae72-4bba-471b-a588-34989e672e20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56a8ae72-4bba-471b-a588-34989e672e20\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.814020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ca0c85e-8f57-406b-a1c8-c439180a084d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.814038 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a9d4713-c699-4829-85ca-0aea43794238-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.814074 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.814153 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.815887 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a9d4713-c699-4829-85ca-0aea43794238-config\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.815923 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a9d4713-c699-4829-85ca-0aea43794238-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.816231 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a9d4713-c699-4829-85ca-0aea43794238-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.816838 4921 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.817223 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ca0c85e-8f57-406b-a1c8-c439180a084d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.817446 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/543a089c1dfa1a3ea02806f4897024fe1cc07b6263ab31cc955b625791781b96/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.817659 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ca0c85e-8f57-406b-a1c8-c439180a084d-config\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.819188 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.819305 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/77249dfc4a7f5c53c3ed408019bc03dd5e2d782c22b037be3869cee2351faa9a/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.819403 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca0c85e-8f57-406b-a1c8-c439180a084d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.819254 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ca0c85e-8f57-406b-a1c8-c439180a084d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.823218 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9d4713-c699-4829-85ca-0aea43794238-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.828680 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.832262 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgw9\" (UniqueName: \"kubernetes.io/projected/0ca0c85e-8f57-406b-a1c8-c439180a084d-kube-api-access-tqgw9\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.833588 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c88g\" (UniqueName: \"kubernetes.io/projected/4a9d4713-c699-4829-85ca-0aea43794238-kube-api-access-2c88g\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.850600 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46276708-9dd4-4943-9fd4-c4d0e8a8a75d\") pod \"ovsdbserver-sb-1\" (UID: \"0ca0c85e-8f57-406b-a1c8-c439180a084d\") " pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.855685 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f012a54-c7b7-4090-a1f3-0c46f34fcfe8\") pod \"ovsdbserver-sb-0\" (UID: \"4a9d4713-c699-4829-85ca-0aea43794238\") " pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.914673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2mc8\" (UniqueName: \"kubernetes.io/projected/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-kube-api-access-v2mc8\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc 
kubenswrapper[4921]: I0318 13:38:33.914720 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56a8ae72-4bba-471b-a588-34989e672e20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56a8ae72-4bba-471b-a588-34989e672e20\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.914880 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.914927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.914955 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.914973 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-config\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.916219 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.916882 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.918542 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-config\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.923419 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.925534 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.927089 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.927140 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56a8ae72-4bba-471b-a588-34989e672e20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56a8ae72-4bba-471b-a588-34989e672e20\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9763f5251751a78e6d0d9ffc6c689561e3f2a998c4765fcc3fa391252db9dee0/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.943821 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2mc8\" (UniqueName: \"kubernetes.io/projected/7187adc5-d2b8-41cd-9ad8-08e8457d30e7-kube-api-access-v2mc8\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.985945 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:33 crc kubenswrapper[4921]: I0318 13:38:33.990598 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56a8ae72-4bba-471b-a588-34989e672e20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56a8ae72-4bba-471b-a588-34989e672e20\") pod \"ovsdbserver-sb-2\" (UID: \"7187adc5-d2b8-41cd-9ad8-08e8457d30e7\") " pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:34 crc kubenswrapper[4921]: I0318 13:38:34.096310 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:34 crc kubenswrapper[4921]: I0318 13:38:34.415724 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 13:38:34 crc kubenswrapper[4921]: I0318 13:38:34.518323 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 13:38:34 crc kubenswrapper[4921]: W0318 13:38:34.524853 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2299f72_ad34_47f5_9d33_14e55ac2b36e.slice/crio-71cef8bebe63bc18d623724f65f11b6320d095877aad71bbbd82de13d17703fc WatchSource:0}: Error finding container 71cef8bebe63bc18d623724f65f11b6320d095877aad71bbbd82de13d17703fc: Status 404 returned error can't find the container with id 71cef8bebe63bc18d623724f65f11b6320d095877aad71bbbd82de13d17703fc Mar 18 13:38:34 crc kubenswrapper[4921]: I0318 13:38:34.615815 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 13:38:34 crc kubenswrapper[4921]: W0318 13:38:34.616459 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a9d4713_c699_4829_85ca_0aea43794238.slice/crio-003acbaf401f55a2db5bed07febc164c79840b9091196c40315c398f4862647a WatchSource:0}: Error finding container 003acbaf401f55a2db5bed07febc164c79840b9091196c40315c398f4862647a: Status 404 returned error can't find the container with id 003acbaf401f55a2db5bed07febc164c79840b9091196c40315c398f4862647a Mar 18 13:38:34 crc kubenswrapper[4921]: I0318 13:38:34.750809 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.329705 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.411968 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a9d4713-c699-4829-85ca-0aea43794238","Type":"ContainerStarted","Data":"3c949ff907d15034a5fed33f3790f5858de95c4c61f29e9828ace6d1b0a32e25"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.412038 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a9d4713-c699-4829-85ca-0aea43794238","Type":"ContainerStarted","Data":"9782291fa42dea605440c5cd218317fa4d359aed2e621195385fc965ff72941c"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.412056 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a9d4713-c699-4829-85ca-0aea43794238","Type":"ContainerStarted","Data":"003acbaf401f55a2db5bed07febc164c79840b9091196c40315c398f4862647a"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.416017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ce5116f2-ee21-4b85-a2fc-2dac3960be81","Type":"ContainerStarted","Data":"5ab28166a7499de56c135e3c32a7615f2fee685f0262ac13aa7de27e0db8cdaa"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.416071 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ce5116f2-ee21-4b85-a2fc-2dac3960be81","Type":"ContainerStarted","Data":"6b537476a45928100dc0a5e9d93ca826d5cf5b64ef39421e319c85bd85b37d1e"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.416082 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ce5116f2-ee21-4b85-a2fc-2dac3960be81","Type":"ContainerStarted","Data":"bf88d65a8d4ccfee2948a356c4af13a793ff7fee1beaa574cb56438d781b0005"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.419949 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"7187adc5-d2b8-41cd-9ad8-08e8457d30e7","Type":"ContainerStarted","Data":"59b827d6f24f5c4b59a81baafad08da152ebdd0ebbaf52bc43553f16b3eff469"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.419987 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"7187adc5-d2b8-41cd-9ad8-08e8457d30e7","Type":"ContainerStarted","Data":"2059cf248bfd4cb465a72cfaaee6e67231077d88f94fa9677f79ff057d007a8b"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.419997 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"7187adc5-d2b8-41cd-9ad8-08e8457d30e7","Type":"ContainerStarted","Data":"743085569a051b9812edcb88fcfd4ea1691d6e4f2ac02781eb8b1b43ec37adec"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.421876 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"cf98ae06-916f-4a34-845c-ab36705146c4","Type":"ContainerStarted","Data":"c20117071e19a8f4fd55127bb66bde1f1e3ae22328a1be2fbcd4eabcc69ccf50"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.424689 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d2299f72-ad34-47f5-9d33-14e55ac2b36e","Type":"ContainerStarted","Data":"3948c3222678f2a5bda2d18847d00c38896ba8fefb564926cd8a8bee403a3464"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.424722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d2299f72-ad34-47f5-9d33-14e55ac2b36e","Type":"ContainerStarted","Data":"a3bc8502a7bb3a559ac904c473331acb9d12c3cd866e16658b21d9da967c24da"} Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.424732 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d2299f72-ad34-47f5-9d33-14e55ac2b36e","Type":"ContainerStarted","Data":"71cef8bebe63bc18d623724f65f11b6320d095877aad71bbbd82de13d17703fc"} Mar 18 13:38:35 crc 
kubenswrapper[4921]: I0318 13:38:35.465844 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.465808022 podStartE2EDuration="3.465808022s" podCreationTimestamp="2026-03-18 13:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:35.444689479 +0000 UTC m=+5334.994610118" watchObservedRunningTime="2026-03-18 13:38:35.465808022 +0000 UTC m=+5335.015728661" Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.468644 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.468610662 podStartE2EDuration="3.468610662s" podCreationTimestamp="2026-03-18 13:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:35.464718651 +0000 UTC m=+5335.014639290" watchObservedRunningTime="2026-03-18 13:38:35.468610662 +0000 UTC m=+5335.018531301" Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.489642 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.489619602 podStartE2EDuration="3.489619602s" podCreationTimestamp="2026-03-18 13:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:35.488760068 +0000 UTC m=+5335.038680707" watchObservedRunningTime="2026-03-18 13:38:35.489619602 +0000 UTC m=+5335.039540241" Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.514471 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.514443721 podStartE2EDuration="3.514443721s" podCreationTimestamp="2026-03-18 13:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:35.511441595 +0000 UTC m=+5335.061362234" watchObservedRunningTime="2026-03-18 13:38:35.514443721 +0000 UTC m=+5335.064364360" Mar 18 13:38:35 crc kubenswrapper[4921]: I0318 13:38:35.543490 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 13:38:35 crc kubenswrapper[4921]: W0318 13:38:35.544937 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ca0c85e_8f57_406b_a1c8_c439180a084d.slice/crio-dd687d459555d46fe5f4a2c850ed45708831a0406ed62503ae82fc828164bbb0 WatchSource:0}: Error finding container dd687d459555d46fe5f4a2c850ed45708831a0406ed62503ae82fc828164bbb0: Status 404 returned error can't find the container with id dd687d459555d46fe5f4a2c850ed45708831a0406ed62503ae82fc828164bbb0 Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.437607 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"0ca0c85e-8f57-406b-a1c8-c439180a084d","Type":"ContainerStarted","Data":"bad5c952e88550afb7fd16133bfdeedca8da58e0bca68c367238e37e9ef0bb8f"} Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.438021 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"0ca0c85e-8f57-406b-a1c8-c439180a084d","Type":"ContainerStarted","Data":"505793315ddbeba21852c73a180401863c88a61fb2605cfd44adfe0954fcc257"} Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.438037 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"0ca0c85e-8f57-406b-a1c8-c439180a084d","Type":"ContainerStarted","Data":"dd687d459555d46fe5f4a2c850ed45708831a0406ed62503ae82fc828164bbb0"} Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.440428 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" 
event={"ID":"cf98ae06-916f-4a34-845c-ab36705146c4","Type":"ContainerStarted","Data":"6ca99c4ea20e3af7c3b2ba06a8d3ac4396ee19b191201ef6a52ba1b88e32dd47"} Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.440465 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"cf98ae06-916f-4a34-845c-ab36705146c4","Type":"ContainerStarted","Data":"9ac6cbe0fd55ede967c9357da461c6d7a22fc6fb9135f017bd677b676758c766"} Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.480369 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.480324445 podStartE2EDuration="4.480324445s" podCreationTimestamp="2026-03-18 13:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:36.476016972 +0000 UTC m=+5336.025937611" watchObservedRunningTime="2026-03-18 13:38:36.480324445 +0000 UTC m=+5336.030245104" Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.507584 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.507549062 podStartE2EDuration="4.507549062s" podCreationTimestamp="2026-03-18 13:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:36.501257502 +0000 UTC m=+5336.051178141" watchObservedRunningTime="2026-03-18 13:38:36.507549062 +0000 UTC m=+5336.057469701" Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.798579 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.813424 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.829967 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.926629 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:36 crc kubenswrapper[4921]: I0318 13:38:36.988084 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:37 crc kubenswrapper[4921]: I0318 13:38:37.097863 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:38 crc kubenswrapper[4921]: I0318 13:38:38.798435 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:38 crc kubenswrapper[4921]: I0318 13:38:38.813587 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:38 crc kubenswrapper[4921]: I0318 13:38:38.830212 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:38 crc kubenswrapper[4921]: I0318 13:38:38.926091 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:38 crc kubenswrapper[4921]: I0318 13:38:38.986483 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.098182 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.840943 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.866253 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.891569 
4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.901812 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.935481 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 18 13:38:39 crc kubenswrapper[4921]: I0318 13:38:39.979018 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.038684 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.046782 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.140664 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-hrk5p"] Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.143097 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.145497 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.159009 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-hrk5p"] Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.169098 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-dns-svc\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.169905 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grdzw\" (UniqueName: \"kubernetes.io/projected/849818fe-592d-451a-a7e0-8b972dd9cab2-kube-api-access-grdzw\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.169986 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-ovsdbserver-nb\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.170007 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-config\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " 
pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.178887 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.233874 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.271901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-dns-svc\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.272031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grdzw\" (UniqueName: \"kubernetes.io/projected/849818fe-592d-451a-a7e0-8b972dd9cab2-kube-api-access-grdzw\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.272150 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-ovsdbserver-nb\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.272170 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-config\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.274305 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-dns-svc\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.274753 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-ovsdbserver-nb\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.275139 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-config\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.313900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grdzw\" (UniqueName: \"kubernetes.io/projected/849818fe-592d-451a-a7e0-8b972dd9cab2-kube-api-access-grdzw\") pod \"dnsmasq-dns-547968cc8f-hrk5p\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.477956 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.522408 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.561593 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.599825 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-hrk5p"] Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.666747 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-tzx4q"] Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.678020 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.687919 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.702270 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-tzx4q"] Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.790621 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-config\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.790708 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-nb\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " 
pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.790740 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-sb\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.790776 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-dns-svc\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.790814 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49mf\" (UniqueName: \"kubernetes.io/projected/94ba0bd3-9942-4de3-9a07-3628472692ec-kube-api-access-j49mf\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.893386 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-config\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.894667 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-config\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " 
pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.897537 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-nb\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.897628 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-sb\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.897810 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-dns-svc\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.897937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49mf\" (UniqueName: \"kubernetes.io/projected/94ba0bd3-9942-4de3-9a07-3628472692ec-kube-api-access-j49mf\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.898272 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-nb\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 
crc kubenswrapper[4921]: I0318 13:38:40.898603 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-sb\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.898854 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-dns-svc\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:40 crc kubenswrapper[4921]: I0318 13:38:40.927127 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49mf\" (UniqueName: \"kubernetes.io/projected/94ba0bd3-9942-4de3-9a07-3628472692ec-kube-api-access-j49mf\") pod \"dnsmasq-dns-7c54468fdc-tzx4q\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.023052 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.157479 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-hrk5p"] Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.486374 4921 generic.go:334] "Generic (PLEG): container finished" podID="849818fe-592d-451a-a7e0-8b972dd9cab2" containerID="c7e2d3bb97356626d4cb028b55b1d75fbb6a9ce029f30dae9b190ca0cf7030ee" exitCode=0 Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.486847 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" event={"ID":"849818fe-592d-451a-a7e0-8b972dd9cab2","Type":"ContainerDied","Data":"c7e2d3bb97356626d4cb028b55b1d75fbb6a9ce029f30dae9b190ca0cf7030ee"} Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.486932 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" event={"ID":"849818fe-592d-451a-a7e0-8b972dd9cab2","Type":"ContainerStarted","Data":"0f26a1194f5b9dc7e469104211b1e92efd076c0b35dce89b12bdf68ea8ecd6e3"} Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.711658 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-tzx4q"] Mar 18 13:38:41 crc kubenswrapper[4921]: I0318 13:38:41.885256 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.019331 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grdzw\" (UniqueName: \"kubernetes.io/projected/849818fe-592d-451a-a7e0-8b972dd9cab2-kube-api-access-grdzw\") pod \"849818fe-592d-451a-a7e0-8b972dd9cab2\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.019439 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-dns-svc\") pod \"849818fe-592d-451a-a7e0-8b972dd9cab2\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.019503 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-ovsdbserver-nb\") pod \"849818fe-592d-451a-a7e0-8b972dd9cab2\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.019568 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-config\") pod \"849818fe-592d-451a-a7e0-8b972dd9cab2\" (UID: \"849818fe-592d-451a-a7e0-8b972dd9cab2\") " Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.025189 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849818fe-592d-451a-a7e0-8b972dd9cab2-kube-api-access-grdzw" (OuterVolumeSpecName: "kube-api-access-grdzw") pod "849818fe-592d-451a-a7e0-8b972dd9cab2" (UID: "849818fe-592d-451a-a7e0-8b972dd9cab2"). InnerVolumeSpecName "kube-api-access-grdzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.041096 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-config" (OuterVolumeSpecName: "config") pod "849818fe-592d-451a-a7e0-8b972dd9cab2" (UID: "849818fe-592d-451a-a7e0-8b972dd9cab2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.042988 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "849818fe-592d-451a-a7e0-8b972dd9cab2" (UID: "849818fe-592d-451a-a7e0-8b972dd9cab2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.052416 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "849818fe-592d-451a-a7e0-8b972dd9cab2" (UID: "849818fe-592d-451a-a7e0-8b972dd9cab2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.122298 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.122352 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grdzw\" (UniqueName: \"kubernetes.io/projected/849818fe-592d-451a-a7e0-8b972dd9cab2-kube-api-access-grdzw\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.122367 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.122380 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/849818fe-592d-451a-a7e0-8b972dd9cab2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.495884 4921 generic.go:334] "Generic (PLEG): container finished" podID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerID="056d83686f459b812f163ff110c22948adda33a51c92014cd85737a84fe4c4f1" exitCode=0 Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.496340 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" event={"ID":"94ba0bd3-9942-4de3-9a07-3628472692ec","Type":"ContainerDied","Data":"056d83686f459b812f163ff110c22948adda33a51c92014cd85737a84fe4c4f1"} Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.496434 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" event={"ID":"94ba0bd3-9942-4de3-9a07-3628472692ec","Type":"ContainerStarted","Data":"7f46ee73bef2932573ef7d243be79a60c643fadf641934894e7b10d7f74ffd54"} Mar 
18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.498094 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" event={"ID":"849818fe-592d-451a-a7e0-8b972dd9cab2","Type":"ContainerDied","Data":"0f26a1194f5b9dc7e469104211b1e92efd076c0b35dce89b12bdf68ea8ecd6e3"} Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.498188 4921 scope.go:117] "RemoveContainer" containerID="c7e2d3bb97356626d4cb028b55b1d75fbb6a9ce029f30dae9b190ca0cf7030ee" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.498188 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547968cc8f-hrk5p" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.691274 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-hrk5p"] Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.700689 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-547968cc8f-hrk5p"] Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.962148 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 18 13:38:42 crc kubenswrapper[4921]: E0318 13:38:42.962681 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849818fe-592d-451a-a7e0-8b972dd9cab2" containerName="init" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.962697 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="849818fe-592d-451a-a7e0-8b972dd9cab2" containerName="init" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.962897 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="849818fe-592d-451a-a7e0-8b972dd9cab2" containerName="init" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.963831 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.968366 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 18 13:38:42 crc kubenswrapper[4921]: I0318 13:38:42.989593 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.137903 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.137990 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4452db3b-5263-4112-947b-d530be2e236d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4452db3b-5263-4112-947b-d530be2e236d\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.138404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6dz\" (UniqueName: \"kubernetes.io/projected/fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae-kube-api-access-fw6dz\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.220332 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849818fe-592d-451a-a7e0-8b972dd9cab2" path="/var/lib/kubelet/pods/849818fe-592d-451a-a7e0-8b972dd9cab2/volumes" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.240084 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6dz\" (UniqueName: 
\"kubernetes.io/projected/fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae-kube-api-access-fw6dz\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.240181 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.240283 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4452db3b-5263-4112-947b-d530be2e236d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4452db3b-5263-4112-947b-d530be2e236d\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.245284 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.245356 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4452db3b-5263-4112-947b-d530be2e236d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4452db3b-5263-4112-947b-d530be2e236d\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fcf930b78baced418b12b309fce873db4b255a5d188acd70ebfc89a5338d34e/globalmount\"" pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.250793 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.273154 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6dz\" (UniqueName: \"kubernetes.io/projected/fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae-kube-api-access-fw6dz\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.275681 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4452db3b-5263-4112-947b-d530be2e236d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4452db3b-5263-4112-947b-d530be2e236d\") pod \"ovn-copy-data\" (UID: \"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae\") " pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.290647 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.511815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" event={"ID":"94ba0bd3-9942-4de3-9a07-3628472692ec","Type":"ContainerStarted","Data":"2b763669d19e3e400b68fdce97ae2c185f2aee49c7f398fb45f1f875b0f3bd8a"} Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.511993 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.533035 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" podStartSLOduration=3.533014128 podStartE2EDuration="3.533014128s" podCreationTimestamp="2026-03-18 13:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:43.53095526 +0000 UTC m=+5343.080875919" watchObservedRunningTime="2026-03-18 13:38:43.533014128 +0000 UTC m=+5343.082934767" Mar 18 13:38:43 crc kubenswrapper[4921]: I0318 13:38:43.874557 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 13:38:44 crc kubenswrapper[4921]: I0318 13:38:44.526849 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae","Type":"ContainerStarted","Data":"e5839abb45851c98263e36d43a6422c9ebdf9945ab5ec5c6c2394218c1ff4bdf"} Mar 18 13:38:44 crc kubenswrapper[4921]: I0318 13:38:44.526904 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae","Type":"ContainerStarted","Data":"3f315b65379f8850523b5ad54b2c293d71bab8c38f9486cabfe7ef0f95f529c9"} Mar 18 13:38:44 crc kubenswrapper[4921]: I0318 13:38:44.551488 4921 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.551461194 podStartE2EDuration="3.551461194s" podCreationTimestamp="2026-03-18 13:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:44.544502355 +0000 UTC m=+5344.094423004" watchObservedRunningTime="2026-03-18 13:38:44.551461194 +0000 UTC m=+5344.101381833" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.338279 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.340232 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.343019 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.343976 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.363583 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4hktn" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.365975 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.474328 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-scripts\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.474500 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2pc\" (UniqueName: 
\"kubernetes.io/projected/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-kube-api-access-rf2pc\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.474621 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.474712 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-config\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.474754 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.576775 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.576846 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-config\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 
13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.576867 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.576909 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-scripts\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.576961 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2pc\" (UniqueName: \"kubernetes.io/projected/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-kube-api-access-rf2pc\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.577855 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.578413 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-scripts\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.578563 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-config\") pod \"ovn-northd-0\" 
(UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.588288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.602570 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2pc\" (UniqueName: \"kubernetes.io/projected/249b8f41-3ce7-4129-aa35-ac8b35f83aa0-kube-api-access-rf2pc\") pod \"ovn-northd-0\" (UID: \"249b8f41-3ce7-4129-aa35-ac8b35f83aa0\") " pod="openstack/ovn-northd-0" Mar 18 13:38:50 crc kubenswrapper[4921]: I0318 13:38:50.663261 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.025372 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.157592 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dwm22"] Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.157967 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" podUID="127e0d98-4617-4732-88a5-808f6436511d" containerName="dnsmasq-dns" containerID="cri-o://7724fbe91b20cafc1927234f3f92193f3fd6024243b4839596a1bb2b704fc489" gracePeriod=10 Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.182905 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.608899 4921 generic.go:334] "Generic (PLEG): container finished" podID="127e0d98-4617-4732-88a5-808f6436511d" 
containerID="7724fbe91b20cafc1927234f3f92193f3fd6024243b4839596a1bb2b704fc489" exitCode=0 Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.608992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" event={"ID":"127e0d98-4617-4732-88a5-808f6436511d","Type":"ContainerDied","Data":"7724fbe91b20cafc1927234f3f92193f3fd6024243b4839596a1bb2b704fc489"} Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.611953 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"249b8f41-3ce7-4129-aa35-ac8b35f83aa0","Type":"ContainerStarted","Data":"999aa868e0455c62e847bef8258cb386c3f52d3b3b2ef29c04462b6b9eff9b48"} Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.611994 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"249b8f41-3ce7-4129-aa35-ac8b35f83aa0","Type":"ContainerStarted","Data":"977e464fea9469fb7bd4059ef18f65c9b49544dbaa390e07fd979c47e8df64e4"} Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.787031 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.916217 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrlpw\" (UniqueName: \"kubernetes.io/projected/127e0d98-4617-4732-88a5-808f6436511d-kube-api-access-jrlpw\") pod \"127e0d98-4617-4732-88a5-808f6436511d\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.916427 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-dns-svc\") pod \"127e0d98-4617-4732-88a5-808f6436511d\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.916531 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-config\") pod \"127e0d98-4617-4732-88a5-808f6436511d\" (UID: \"127e0d98-4617-4732-88a5-808f6436511d\") " Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.922562 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127e0d98-4617-4732-88a5-808f6436511d-kube-api-access-jrlpw" (OuterVolumeSpecName: "kube-api-access-jrlpw") pod "127e0d98-4617-4732-88a5-808f6436511d" (UID: "127e0d98-4617-4732-88a5-808f6436511d"). InnerVolumeSpecName "kube-api-access-jrlpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.957804 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-config" (OuterVolumeSpecName: "config") pod "127e0d98-4617-4732-88a5-808f6436511d" (UID: "127e0d98-4617-4732-88a5-808f6436511d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:38:51 crc kubenswrapper[4921]: I0318 13:38:51.959929 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "127e0d98-4617-4732-88a5-808f6436511d" (UID: "127e0d98-4617-4732-88a5-808f6436511d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.019508 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.019544 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/127e0d98-4617-4732-88a5-808f6436511d-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.019555 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrlpw\" (UniqueName: \"kubernetes.io/projected/127e0d98-4617-4732-88a5-808f6436511d-kube-api-access-jrlpw\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.621558 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"249b8f41-3ce7-4129-aa35-ac8b35f83aa0","Type":"ContainerStarted","Data":"8b63d0b7c775d9f8e0d076d12b06638e62832f14c1bd9ab41d06c35af636b44f"} Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.622672 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.624455 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22" 
event={"ID":"127e0d98-4617-4732-88a5-808f6436511d","Type":"ContainerDied","Data":"aa8e52c7ff55181041819ed101b2c12d822cb5414f7c6cf6863277e047a58d44"}
Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.624487 4921 scope.go:117] "RemoveContainer" containerID="7724fbe91b20cafc1927234f3f92193f3fd6024243b4839596a1bb2b704fc489"
Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.624580 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-dwm22"
Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.655397 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.655366338 podStartE2EDuration="2.655366338s" podCreationTimestamp="2026-03-18 13:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:52.649503831 +0000 UTC m=+5352.199424490" watchObservedRunningTime="2026-03-18 13:38:52.655366338 +0000 UTC m=+5352.205286977"
Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.662971 4921 scope.go:117] "RemoveContainer" containerID="e4d92b0e2461a07eb022a09429ebe31489e94675bcc91be4d67e861e997f88a5"
Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.675238 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dwm22"]
Mar 18 13:38:52 crc kubenswrapper[4921]: I0318 13:38:52.700913 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-dwm22"]
Mar 18 13:38:53 crc kubenswrapper[4921]: I0318 13:38:53.219547 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127e0d98-4617-4732-88a5-808f6436511d" path="/var/lib/kubelet/pods/127e0d98-4617-4732-88a5-808f6436511d/volumes"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.834402 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bclj2"]
Mar 18 13:38:55 crc kubenswrapper[4921]: E0318 13:38:55.835195 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e0d98-4617-4732-88a5-808f6436511d" containerName="dnsmasq-dns"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.835213 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e0d98-4617-4732-88a5-808f6436511d" containerName="dnsmasq-dns"
Mar 18 13:38:55 crc kubenswrapper[4921]: E0318 13:38:55.835222 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e0d98-4617-4732-88a5-808f6436511d" containerName="init"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.835228 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e0d98-4617-4732-88a5-808f6436511d" containerName="init"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.835399 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="127e0d98-4617-4732-88a5-808f6436511d" containerName="dnsmasq-dns"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.836073 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.851069 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bclj2"]
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.877943 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2ee0-account-create-update-fg6v2"]
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.879806 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.882641 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.898138 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2ee0-account-create-update-fg6v2"]
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.991179 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlv5g\" (UniqueName: \"kubernetes.io/projected/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-kube-api-access-xlv5g\") pod \"keystone-db-create-bclj2\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") " pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.991328 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-operator-scripts\") pod \"keystone-2ee0-account-create-update-fg6v2\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") " pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.991445 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-operator-scripts\") pod \"keystone-db-create-bclj2\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") " pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:55 crc kubenswrapper[4921]: I0318 13:38:55.991503 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76xb\" (UniqueName: \"kubernetes.io/projected/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-kube-api-access-x76xb\") pod \"keystone-2ee0-account-create-update-fg6v2\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") " pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.092980 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76xb\" (UniqueName: \"kubernetes.io/projected/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-kube-api-access-x76xb\") pod \"keystone-2ee0-account-create-update-fg6v2\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") " pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.093176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlv5g\" (UniqueName: \"kubernetes.io/projected/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-kube-api-access-xlv5g\") pod \"keystone-db-create-bclj2\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") " pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.093212 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-operator-scripts\") pod \"keystone-2ee0-account-create-update-fg6v2\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") " pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.093256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-operator-scripts\") pod \"keystone-db-create-bclj2\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") " pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.094171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-operator-scripts\") pod \"keystone-db-create-bclj2\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") " pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.095300 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-operator-scripts\") pod \"keystone-2ee0-account-create-update-fg6v2\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") " pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.120861 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlv5g\" (UniqueName: \"kubernetes.io/projected/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-kube-api-access-xlv5g\") pod \"keystone-db-create-bclj2\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") " pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.124035 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76xb\" (UniqueName: \"kubernetes.io/projected/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-kube-api-access-x76xb\") pod \"keystone-2ee0-account-create-update-fg6v2\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") " pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.157818 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bclj2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.205821 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.697679 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bclj2"]
Mar 18 13:38:56 crc kubenswrapper[4921]: W0318 13:38:56.715312 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc7b3690_bd67_4875_bacb_5bfb5ff7c161.slice/crio-91700bf2910961456598989ede2199665a872da3271931e1801c480b5967a878 WatchSource:0}: Error finding container 91700bf2910961456598989ede2199665a872da3271931e1801c480b5967a878: Status 404 returned error can't find the container with id 91700bf2910961456598989ede2199665a872da3271931e1801c480b5967a878
Mar 18 13:38:56 crc kubenswrapper[4921]: I0318 13:38:56.789202 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2ee0-account-create-update-fg6v2"]
Mar 18 13:38:57 crc kubenswrapper[4921]: I0318 13:38:57.666787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bclj2" event={"ID":"fc7b3690-bd67-4875-bacb-5bfb5ff7c161","Type":"ContainerStarted","Data":"56a63e99653a80a876b71fa71aca2dd9d05733692b7ba92d42b45b6e79149cc9"}
Mar 18 13:38:57 crc kubenswrapper[4921]: I0318 13:38:57.667100 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bclj2" event={"ID":"fc7b3690-bd67-4875-bacb-5bfb5ff7c161","Type":"ContainerStarted","Data":"91700bf2910961456598989ede2199665a872da3271931e1801c480b5967a878"}
Mar 18 13:38:57 crc kubenswrapper[4921]: I0318 13:38:57.668701 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ee0-account-create-update-fg6v2" event={"ID":"7ecdf7f2-9bc5-494d-b67e-0bccbac89401","Type":"ContainerStarted","Data":"83be224397eb769822f6c508db1bb8357cf611c078710aa05d490bf6edfc1b5d"}
Mar 18 13:38:57 crc kubenswrapper[4921]: I0318 13:38:57.668769 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ee0-account-create-update-fg6v2" event={"ID":"7ecdf7f2-9bc5-494d-b67e-0bccbac89401","Type":"ContainerStarted","Data":"d91a632be13250bd1d78f2e12ea723607ac31a126a7e64ff255377669176642d"}
Mar 18 13:38:57 crc kubenswrapper[4921]: I0318 13:38:57.695667 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bclj2" podStartSLOduration=2.695641529 podStartE2EDuration="2.695641529s" podCreationTimestamp="2026-03-18 13:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:57.685237622 +0000 UTC m=+5357.235158271" watchObservedRunningTime="2026-03-18 13:38:57.695641529 +0000 UTC m=+5357.245562168"
Mar 18 13:38:57 crc kubenswrapper[4921]: I0318 13:38:57.710618 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2ee0-account-create-update-fg6v2" podStartSLOduration=2.710592886 podStartE2EDuration="2.710592886s" podCreationTimestamp="2026-03-18 13:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:38:57.702974209 +0000 UTC m=+5357.252894848" watchObservedRunningTime="2026-03-18 13:38:57.710592886 +0000 UTC m=+5357.260513525"
Mar 18 13:38:58 crc kubenswrapper[4921]: I0318 13:38:58.678852 4921 generic.go:334] "Generic (PLEG): container finished" podID="fc7b3690-bd67-4875-bacb-5bfb5ff7c161" containerID="56a63e99653a80a876b71fa71aca2dd9d05733692b7ba92d42b45b6e79149cc9" exitCode=0
Mar 18 13:38:58 crc kubenswrapper[4921]: I0318 13:38:58.678989 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bclj2" event={"ID":"fc7b3690-bd67-4875-bacb-5bfb5ff7c161","Type":"ContainerDied","Data":"56a63e99653a80a876b71fa71aca2dd9d05733692b7ba92d42b45b6e79149cc9"}
Mar 18 13:38:59 crc kubenswrapper[4921]: I0318 13:38:59.688646 4921 generic.go:334] "Generic (PLEG): container finished" podID="7ecdf7f2-9bc5-494d-b67e-0bccbac89401" containerID="83be224397eb769822f6c508db1bb8357cf611c078710aa05d490bf6edfc1b5d" exitCode=0
Mar 18 13:38:59 crc kubenswrapper[4921]: I0318 13:38:59.688723 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ee0-account-create-update-fg6v2" event={"ID":"7ecdf7f2-9bc5-494d-b67e-0bccbac89401","Type":"ContainerDied","Data":"83be224397eb769822f6c508db1bb8357cf611c078710aa05d490bf6edfc1b5d"}
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.042377 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bclj2"
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.104234 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-operator-scripts\") pod \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") "
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.105129 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc7b3690-bd67-4875-bacb-5bfb5ff7c161" (UID: "fc7b3690-bd67-4875-bacb-5bfb5ff7c161"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.105208 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlv5g\" (UniqueName: \"kubernetes.io/projected/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-kube-api-access-xlv5g\") pod \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\" (UID: \"fc7b3690-bd67-4875-bacb-5bfb5ff7c161\") "
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.105504 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.113859 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-kube-api-access-xlv5g" (OuterVolumeSpecName: "kube-api-access-xlv5g") pod "fc7b3690-bd67-4875-bacb-5bfb5ff7c161" (UID: "fc7b3690-bd67-4875-bacb-5bfb5ff7c161"). InnerVolumeSpecName "kube-api-access-xlv5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.206343 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlv5g\" (UniqueName: \"kubernetes.io/projected/fc7b3690-bd67-4875-bacb-5bfb5ff7c161-kube-api-access-xlv5g\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.697637 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bclj2" event={"ID":"fc7b3690-bd67-4875-bacb-5bfb5ff7c161","Type":"ContainerDied","Data":"91700bf2910961456598989ede2199665a872da3271931e1801c480b5967a878"}
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.697725 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91700bf2910961456598989ede2199665a872da3271931e1801c480b5967a878"
Mar 18 13:39:00 crc kubenswrapper[4921]: I0318 13:39:00.697660 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bclj2"
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.132588 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.335175 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x76xb\" (UniqueName: \"kubernetes.io/projected/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-kube-api-access-x76xb\") pod \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") "
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.335311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-operator-scripts\") pod \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\" (UID: \"7ecdf7f2-9bc5-494d-b67e-0bccbac89401\") "
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.335868 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ecdf7f2-9bc5-494d-b67e-0bccbac89401" (UID: "7ecdf7f2-9bc5-494d-b67e-0bccbac89401"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.341097 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-kube-api-access-x76xb" (OuterVolumeSpecName: "kube-api-access-x76xb") pod "7ecdf7f2-9bc5-494d-b67e-0bccbac89401" (UID: "7ecdf7f2-9bc5-494d-b67e-0bccbac89401"). InnerVolumeSpecName "kube-api-access-x76xb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.437943 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x76xb\" (UniqueName: \"kubernetes.io/projected/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-kube-api-access-x76xb\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.437998 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ecdf7f2-9bc5-494d-b67e-0bccbac89401-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.706419 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ee0-account-create-update-fg6v2" event={"ID":"7ecdf7f2-9bc5-494d-b67e-0bccbac89401","Type":"ContainerDied","Data":"d91a632be13250bd1d78f2e12ea723607ac31a126a7e64ff255377669176642d"}
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.706460 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d91a632be13250bd1d78f2e12ea723607ac31a126a7e64ff255377669176642d"
Mar 18 13:39:01 crc kubenswrapper[4921]: I0318 13:39:01.706494 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ee0-account-create-update-fg6v2"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.447100 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jcl92"]
Mar 18 13:39:06 crc kubenswrapper[4921]: E0318 13:39:06.447983 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecdf7f2-9bc5-494d-b67e-0bccbac89401" containerName="mariadb-account-create-update"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.447997 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecdf7f2-9bc5-494d-b67e-0bccbac89401" containerName="mariadb-account-create-update"
Mar 18 13:39:06 crc kubenswrapper[4921]: E0318 13:39:06.448025 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7b3690-bd67-4875-bacb-5bfb5ff7c161" containerName="mariadb-database-create"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.448032 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7b3690-bd67-4875-bacb-5bfb5ff7c161" containerName="mariadb-database-create"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.448272 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecdf7f2-9bc5-494d-b67e-0bccbac89401" containerName="mariadb-account-create-update"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.448297 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7b3690-bd67-4875-bacb-5bfb5ff7c161" containerName="mariadb-database-create"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.448926 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.452458 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.452944 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.452941 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.453800 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vl2wb"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.456896 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jcl92"]
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.495944 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwzm\" (UniqueName: \"kubernetes.io/projected/72391d03-9818-4c13-8327-d63607cd54fa-kube-api-access-czwzm\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.496027 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-combined-ca-bundle\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.496394 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-config-data\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.598011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czwzm\" (UniqueName: \"kubernetes.io/projected/72391d03-9818-4c13-8327-d63607cd54fa-kube-api-access-czwzm\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.598065 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-combined-ca-bundle\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.598213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-config-data\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.605880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-combined-ca-bundle\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.606000 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-config-data\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.613236 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwzm\" (UniqueName: \"kubernetes.io/projected/72391d03-9818-4c13-8327-d63607cd54fa-kube-api-access-czwzm\") pod \"keystone-db-sync-jcl92\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") " pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:06 crc kubenswrapper[4921]: I0318 13:39:06.814379 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:07 crc kubenswrapper[4921]: I0318 13:39:07.337459 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jcl92"]
Mar 18 13:39:07 crc kubenswrapper[4921]: I0318 13:39:07.760032 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jcl92" event={"ID":"72391d03-9818-4c13-8327-d63607cd54fa","Type":"ContainerStarted","Data":"7ab6d7e4c5f1e8d4501bc9049340b69dae819d320fc22a3125339511646e97c2"}
Mar 18 13:39:07 crc kubenswrapper[4921]: I0318 13:39:07.760586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jcl92" event={"ID":"72391d03-9818-4c13-8327-d63607cd54fa","Type":"ContainerStarted","Data":"a797cfa8fc606cbe3d4b5816956d2f7a484a9ab53d1ee7fcec71b98230065715"}
Mar 18 13:39:07 crc kubenswrapper[4921]: I0318 13:39:07.784271 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jcl92" podStartSLOduration=1.784244754 podStartE2EDuration="1.784244754s" podCreationTimestamp="2026-03-18 13:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:39:07.779933051 +0000 UTC m=+5367.329853690" watchObservedRunningTime="2026-03-18 13:39:07.784244754 +0000 UTC m=+5367.334165393"
Mar 18 13:39:09 crc kubenswrapper[4921]: I0318 13:39:09.779063 4921 generic.go:334] "Generic (PLEG): container finished" podID="72391d03-9818-4c13-8327-d63607cd54fa" containerID="7ab6d7e4c5f1e8d4501bc9049340b69dae819d320fc22a3125339511646e97c2" exitCode=0
Mar 18 13:39:09 crc kubenswrapper[4921]: I0318 13:39:09.779149 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jcl92" event={"ID":"72391d03-9818-4c13-8327-d63607cd54fa","Type":"ContainerDied","Data":"7ab6d7e4c5f1e8d4501bc9049340b69dae819d320fc22a3125339511646e97c2"}
Mar 18 13:39:10 crc kubenswrapper[4921]: I0318 13:39:10.723605 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.117654 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.284057 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-combined-ca-bundle\") pod \"72391d03-9818-4c13-8327-d63607cd54fa\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") "
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.284693 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czwzm\" (UniqueName: \"kubernetes.io/projected/72391d03-9818-4c13-8327-d63607cd54fa-kube-api-access-czwzm\") pod \"72391d03-9818-4c13-8327-d63607cd54fa\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") "
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.284851 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-config-data\") pod \"72391d03-9818-4c13-8327-d63607cd54fa\" (UID: \"72391d03-9818-4c13-8327-d63607cd54fa\") "
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.298346 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72391d03-9818-4c13-8327-d63607cd54fa-kube-api-access-czwzm" (OuterVolumeSpecName: "kube-api-access-czwzm") pod "72391d03-9818-4c13-8327-d63607cd54fa" (UID: "72391d03-9818-4c13-8327-d63607cd54fa"). InnerVolumeSpecName "kube-api-access-czwzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.321297 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72391d03-9818-4c13-8327-d63607cd54fa" (UID: "72391d03-9818-4c13-8327-d63607cd54fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.337878 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-config-data" (OuterVolumeSpecName: "config-data") pod "72391d03-9818-4c13-8327-d63607cd54fa" (UID: "72391d03-9818-4c13-8327-d63607cd54fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.393781 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.393831 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czwzm\" (UniqueName: \"kubernetes.io/projected/72391d03-9818-4c13-8327-d63607cd54fa-kube-api-access-czwzm\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.393846 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72391d03-9818-4c13-8327-d63607cd54fa-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.795694 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jcl92" event={"ID":"72391d03-9818-4c13-8327-d63607cd54fa","Type":"ContainerDied","Data":"a797cfa8fc606cbe3d4b5816956d2f7a484a9ab53d1ee7fcec71b98230065715"}
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.795959 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a797cfa8fc606cbe3d4b5816956d2f7a484a9ab53d1ee7fcec71b98230065715"
Mar 18 13:39:11 crc kubenswrapper[4921]: I0318 13:39:11.795846 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jcl92"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.021594 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s5rvr"]
Mar 18 13:39:12 crc kubenswrapper[4921]: E0318 13:39:12.021967 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72391d03-9818-4c13-8327-d63607cd54fa" containerName="keystone-db-sync"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.021986 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="72391d03-9818-4c13-8327-d63607cd54fa" containerName="keystone-db-sync"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.022186 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="72391d03-9818-4c13-8327-d63607cd54fa" containerName="keystone-db-sync"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.022761 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.028666 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.028716 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.028775 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.028716 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vl2wb"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.030601 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.044847 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s5rvr"]
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.066445 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-j2s62"]
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.067789 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-j2s62"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.085342 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-j2s62"]
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.107194 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-config-data\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.107300 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-credential-keys\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.107417 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-combined-ca-bundle\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.107452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-scripts\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.107479 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7x5\" (UniqueName: \"kubernetes.io/projected/bfeb191e-9d06-446f-8707-803a57ecb8c8-kube-api-access-9t7x5\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.108207 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-fernet-keys\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209482 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-scripts\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209533 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-sb\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209557 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t7x5\" (UniqueName: \"kubernetes.io/projected/bfeb191e-9d06-446f-8707-803a57ecb8c8-kube-api-access-9t7x5\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209622 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-fernet-keys\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209656 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-nb\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209677 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krp6g\" (UniqueName: \"kubernetes.io/projected/1964d097-3a65-4efd-bd58-16e101b36d1d-kube-api-access-krp6g\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209710 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-config-data\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr"
Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.209943 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-credential-keys\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " 
pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.210093 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-dns-svc\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.210205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-config\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.210311 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-combined-ca-bundle\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.214467 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-combined-ca-bundle\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.214521 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-scripts\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 
13:39:12.216509 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-config-data\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.227877 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-fernet-keys\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.229750 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t7x5\" (UniqueName: \"kubernetes.io/projected/bfeb191e-9d06-446f-8707-803a57ecb8c8-kube-api-access-9t7x5\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.233791 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-credential-keys\") pod \"keystone-bootstrap-s5rvr\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.312012 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-nb\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.312094 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krp6g\" 
(UniqueName: \"kubernetes.io/projected/1964d097-3a65-4efd-bd58-16e101b36d1d-kube-api-access-krp6g\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.312467 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-dns-svc\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.312512 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-config\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.312624 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-sb\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.313146 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-nb\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.313397 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-dns-svc\") pod 
\"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.313673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-config\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.314085 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-sb\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.338002 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krp6g\" (UniqueName: \"kubernetes.io/projected/1964d097-3a65-4efd-bd58-16e101b36d1d-kube-api-access-krp6g\") pod \"dnsmasq-dns-7485969d9c-j2s62\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.353754 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.392434 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.828187 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s5rvr"] Mar 18 13:39:12 crc kubenswrapper[4921]: I0318 13:39:12.934957 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-j2s62"] Mar 18 13:39:12 crc kubenswrapper[4921]: W0318 13:39:12.955224 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1964d097_3a65_4efd_bd58_16e101b36d1d.slice/crio-59e8d35a23902adc56021e266537b333b7e16918f48849f4be4d145f1b873955 WatchSource:0}: Error finding container 59e8d35a23902adc56021e266537b333b7e16918f48849f4be4d145f1b873955: Status 404 returned error can't find the container with id 59e8d35a23902adc56021e266537b333b7e16918f48849f4be4d145f1b873955 Mar 18 13:39:13 crc kubenswrapper[4921]: I0318 13:39:13.814395 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s5rvr" event={"ID":"bfeb191e-9d06-446f-8707-803a57ecb8c8","Type":"ContainerStarted","Data":"9d505edab099638347c83892a468b29ddbeed91243f052b66bac3c2ef60263fa"} Mar 18 13:39:13 crc kubenswrapper[4921]: I0318 13:39:13.814452 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s5rvr" event={"ID":"bfeb191e-9d06-446f-8707-803a57ecb8c8","Type":"ContainerStarted","Data":"0122c7e3e67dd697ebc8cee822c8c5500449627ae31eaa932eb6eb1ec648d579"} Mar 18 13:39:13 crc kubenswrapper[4921]: I0318 13:39:13.817154 4921 generic.go:334] "Generic (PLEG): container finished" podID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerID="a20f2c21d44e2fcd810c9a1abe5f39676c6a57c2100d3a42e8ce72e9d09468c4" exitCode=0 Mar 18 13:39:13 crc kubenswrapper[4921]: I0318 13:39:13.817193 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" 
event={"ID":"1964d097-3a65-4efd-bd58-16e101b36d1d","Type":"ContainerDied","Data":"a20f2c21d44e2fcd810c9a1abe5f39676c6a57c2100d3a42e8ce72e9d09468c4"} Mar 18 13:39:13 crc kubenswrapper[4921]: I0318 13:39:13.817212 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" event={"ID":"1964d097-3a65-4efd-bd58-16e101b36d1d","Type":"ContainerStarted","Data":"59e8d35a23902adc56021e266537b333b7e16918f48849f4be4d145f1b873955"} Mar 18 13:39:13 crc kubenswrapper[4921]: I0318 13:39:13.859938 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s5rvr" podStartSLOduration=2.859913604 podStartE2EDuration="2.859913604s" podCreationTimestamp="2026-03-18 13:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:39:13.853336796 +0000 UTC m=+5373.403257455" watchObservedRunningTime="2026-03-18 13:39:13.859913604 +0000 UTC m=+5373.409834243" Mar 18 13:39:14 crc kubenswrapper[4921]: I0318 13:39:14.826402 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" event={"ID":"1964d097-3a65-4efd-bd58-16e101b36d1d","Type":"ContainerStarted","Data":"6034c34302167f7e6fe51f3e0584d14fcd75468f0ac96ae3cc2e61ee06721561"} Mar 18 13:39:14 crc kubenswrapper[4921]: I0318 13:39:14.827514 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:14 crc kubenswrapper[4921]: I0318 13:39:14.848215 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" podStartSLOduration=2.848195688 podStartE2EDuration="2.848195688s" podCreationTimestamp="2026-03-18 13:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:39:14.843486204 +0000 UTC 
m=+5374.393406863" watchObservedRunningTime="2026-03-18 13:39:14.848195688 +0000 UTC m=+5374.398116327" Mar 18 13:39:16 crc kubenswrapper[4921]: I0318 13:39:16.842870 4921 generic.go:334] "Generic (PLEG): container finished" podID="bfeb191e-9d06-446f-8707-803a57ecb8c8" containerID="9d505edab099638347c83892a468b29ddbeed91243f052b66bac3c2ef60263fa" exitCode=0 Mar 18 13:39:16 crc kubenswrapper[4921]: I0318 13:39:16.842991 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s5rvr" event={"ID":"bfeb191e-9d06-446f-8707-803a57ecb8c8","Type":"ContainerDied","Data":"9d505edab099638347c83892a468b29ddbeed91243f052b66bac3c2ef60263fa"} Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.218057 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.247532 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-fernet-keys\") pod \"bfeb191e-9d06-446f-8707-803a57ecb8c8\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.247839 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-combined-ca-bundle\") pod \"bfeb191e-9d06-446f-8707-803a57ecb8c8\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.248141 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-scripts\") pod \"bfeb191e-9d06-446f-8707-803a57ecb8c8\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.248281 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9t7x5\" (UniqueName: \"kubernetes.io/projected/bfeb191e-9d06-446f-8707-803a57ecb8c8-kube-api-access-9t7x5\") pod \"bfeb191e-9d06-446f-8707-803a57ecb8c8\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.248406 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-config-data\") pod \"bfeb191e-9d06-446f-8707-803a57ecb8c8\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.248562 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-credential-keys\") pod \"bfeb191e-9d06-446f-8707-803a57ecb8c8\" (UID: \"bfeb191e-9d06-446f-8707-803a57ecb8c8\") " Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.256974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bfeb191e-9d06-446f-8707-803a57ecb8c8" (UID: "bfeb191e-9d06-446f-8707-803a57ecb8c8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.258722 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfeb191e-9d06-446f-8707-803a57ecb8c8-kube-api-access-9t7x5" (OuterVolumeSpecName: "kube-api-access-9t7x5") pod "bfeb191e-9d06-446f-8707-803a57ecb8c8" (UID: "bfeb191e-9d06-446f-8707-803a57ecb8c8"). InnerVolumeSpecName "kube-api-access-9t7x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.261461 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bfeb191e-9d06-446f-8707-803a57ecb8c8" (UID: "bfeb191e-9d06-446f-8707-803a57ecb8c8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.263431 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-scripts" (OuterVolumeSpecName: "scripts") pod "bfeb191e-9d06-446f-8707-803a57ecb8c8" (UID: "bfeb191e-9d06-446f-8707-803a57ecb8c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.278999 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-config-data" (OuterVolumeSpecName: "config-data") pod "bfeb191e-9d06-446f-8707-803a57ecb8c8" (UID: "bfeb191e-9d06-446f-8707-803a57ecb8c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.281643 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfeb191e-9d06-446f-8707-803a57ecb8c8" (UID: "bfeb191e-9d06-446f-8707-803a57ecb8c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.351080 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.351124 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t7x5\" (UniqueName: \"kubernetes.io/projected/bfeb191e-9d06-446f-8707-803a57ecb8c8-kube-api-access-9t7x5\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.351137 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.351159 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.351172 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.351184 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfeb191e-9d06-446f-8707-803a57ecb8c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.863990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s5rvr" event={"ID":"bfeb191e-9d06-446f-8707-803a57ecb8c8","Type":"ContainerDied","Data":"0122c7e3e67dd697ebc8cee822c8c5500449627ae31eaa932eb6eb1ec648d579"} Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 
13:39:18.864035 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0122c7e3e67dd697ebc8cee822c8c5500449627ae31eaa932eb6eb1ec648d579" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.864097 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s5rvr" Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.961952 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s5rvr"] Mar 18 13:39:18 crc kubenswrapper[4921]: I0318 13:39:18.990526 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s5rvr"] Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.048200 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-54nxb"] Mar 18 13:39:19 crc kubenswrapper[4921]: E0318 13:39:19.048616 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfeb191e-9d06-446f-8707-803a57ecb8c8" containerName="keystone-bootstrap" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.048640 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfeb191e-9d06-446f-8707-803a57ecb8c8" containerName="keystone-bootstrap" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.048858 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfeb191e-9d06-446f-8707-803a57ecb8c8" containerName="keystone-bootstrap" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.049535 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.051768 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.051833 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vl2wb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.051979 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.052140 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.053607 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.062296 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-54nxb"] Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.064745 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-config-data\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.064925 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-credential-keys\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.065025 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-combined-ca-bundle\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.065163 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-scripts\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.065277 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wrpc\" (UniqueName: \"kubernetes.io/projected/a68984fe-0046-43ea-b5f1-2809a85ff847-kube-api-access-4wrpc\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.065474 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-fernet-keys\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.167380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-credential-keys\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.167728 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-combined-ca-bundle\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.167774 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-scripts\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.167806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wrpc\" (UniqueName: \"kubernetes.io/projected/a68984fe-0046-43ea-b5f1-2809a85ff847-kube-api-access-4wrpc\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.167839 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-fernet-keys\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.167886 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-config-data\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.171730 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-scripts\") pod \"keystone-bootstrap-54nxb\" (UID: 
\"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.172798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-credential-keys\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.172937 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-combined-ca-bundle\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.173266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-config-data\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.174281 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-fernet-keys\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.185872 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wrpc\" (UniqueName: \"kubernetes.io/projected/a68984fe-0046-43ea-b5f1-2809a85ff847-kube-api-access-4wrpc\") pod \"keystone-bootstrap-54nxb\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 
13:39:19.220373 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfeb191e-9d06-446f-8707-803a57ecb8c8" path="/var/lib/kubelet/pods/bfeb191e-9d06-446f-8707-803a57ecb8c8/volumes" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.366529 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.800994 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-54nxb"] Mar 18 13:39:19 crc kubenswrapper[4921]: I0318 13:39:19.877342 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-54nxb" event={"ID":"a68984fe-0046-43ea-b5f1-2809a85ff847","Type":"ContainerStarted","Data":"76632152e1452930698fd7cf03f179a2d776874587501988634008d40bd4ea38"} Mar 18 13:39:20 crc kubenswrapper[4921]: I0318 13:39:20.888267 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-54nxb" event={"ID":"a68984fe-0046-43ea-b5f1-2809a85ff847","Type":"ContainerStarted","Data":"c6fb3d072a73a5df06123b12eef9cb41babe6480784877a459075862621412b7"} Mar 18 13:39:20 crc kubenswrapper[4921]: I0318 13:39:20.919615 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-54nxb" podStartSLOduration=1.919577527 podStartE2EDuration="1.919577527s" podCreationTimestamp="2026-03-18 13:39:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:39:20.912966558 +0000 UTC m=+5380.462887227" watchObservedRunningTime="2026-03-18 13:39:20.919577527 +0000 UTC m=+5380.469498206" Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.394295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.451938 4921 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-tzx4q"] Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.452376 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerName="dnsmasq-dns" containerID="cri-o://2b763669d19e3e400b68fdce97ae2c185f2aee49c7f398fb45f1f875b0f3bd8a" gracePeriod=10 Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.907908 4921 generic.go:334] "Generic (PLEG): container finished" podID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerID="2b763669d19e3e400b68fdce97ae2c185f2aee49c7f398fb45f1f875b0f3bd8a" exitCode=0 Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.908588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" event={"ID":"94ba0bd3-9942-4de3-9a07-3628472692ec","Type":"ContainerDied","Data":"2b763669d19e3e400b68fdce97ae2c185f2aee49c7f398fb45f1f875b0f3bd8a"} Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.908636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" event={"ID":"94ba0bd3-9942-4de3-9a07-3628472692ec","Type":"ContainerDied","Data":"7f46ee73bef2932573ef7d243be79a60c643fadf641934894e7b10d7f74ffd54"} Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.908650 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f46ee73bef2932573ef7d243be79a60c643fadf641934894e7b10d7f74ffd54" Mar 18 13:39:22 crc kubenswrapper[4921]: I0318 13:39:22.975962 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.136777 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-dns-svc\") pod \"94ba0bd3-9942-4de3-9a07-3628472692ec\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.137275 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-sb\") pod \"94ba0bd3-9942-4de3-9a07-3628472692ec\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.137315 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-config\") pod \"94ba0bd3-9942-4de3-9a07-3628472692ec\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.137342 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j49mf\" (UniqueName: \"kubernetes.io/projected/94ba0bd3-9942-4de3-9a07-3628472692ec-kube-api-access-j49mf\") pod \"94ba0bd3-9942-4de3-9a07-3628472692ec\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.137364 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-nb\") pod \"94ba0bd3-9942-4de3-9a07-3628472692ec\" (UID: \"94ba0bd3-9942-4de3-9a07-3628472692ec\") " Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.143549 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/94ba0bd3-9942-4de3-9a07-3628472692ec-kube-api-access-j49mf" (OuterVolumeSpecName: "kube-api-access-j49mf") pod "94ba0bd3-9942-4de3-9a07-3628472692ec" (UID: "94ba0bd3-9942-4de3-9a07-3628472692ec"). InnerVolumeSpecName "kube-api-access-j49mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.179209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94ba0bd3-9942-4de3-9a07-3628472692ec" (UID: "94ba0bd3-9942-4de3-9a07-3628472692ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.180594 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "94ba0bd3-9942-4de3-9a07-3628472692ec" (UID: "94ba0bd3-9942-4de3-9a07-3628472692ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.180673 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94ba0bd3-9942-4de3-9a07-3628472692ec" (UID: "94ba0bd3-9942-4de3-9a07-3628472692ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.189616 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-config" (OuterVolumeSpecName: "config") pod "94ba0bd3-9942-4de3-9a07-3628472692ec" (UID: "94ba0bd3-9942-4de3-9a07-3628472692ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.239815 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.239863 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.239877 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j49mf\" (UniqueName: \"kubernetes.io/projected/94ba0bd3-9942-4de3-9a07-3628472692ec-kube-api-access-j49mf\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.239893 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.239904 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94ba0bd3-9942-4de3-9a07-3628472692ec-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.918541 4921 generic.go:334] "Generic (PLEG): container finished" podID="a68984fe-0046-43ea-b5f1-2809a85ff847" containerID="c6fb3d072a73a5df06123b12eef9cb41babe6480784877a459075862621412b7" exitCode=0 Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.918605 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-54nxb" event={"ID":"a68984fe-0046-43ea-b5f1-2809a85ff847","Type":"ContainerDied","Data":"c6fb3d072a73a5df06123b12eef9cb41babe6480784877a459075862621412b7"} Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 
13:39:23.918661 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c54468fdc-tzx4q" Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.970078 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-tzx4q"] Mar 18 13:39:23 crc kubenswrapper[4921]: I0318 13:39:23.978379 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c54468fdc-tzx4q"] Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.221163 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" path="/var/lib/kubelet/pods/94ba0bd3-9942-4de3-9a07-3628472692ec/volumes" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.334368 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.482728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-fernet-keys\") pod \"a68984fe-0046-43ea-b5f1-2809a85ff847\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.482826 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wrpc\" (UniqueName: \"kubernetes.io/projected/a68984fe-0046-43ea-b5f1-2809a85ff847-kube-api-access-4wrpc\") pod \"a68984fe-0046-43ea-b5f1-2809a85ff847\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.482858 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-config-data\") pod \"a68984fe-0046-43ea-b5f1-2809a85ff847\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " Mar 18 13:39:25 crc 
kubenswrapper[4921]: I0318 13:39:25.482962 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-combined-ca-bundle\") pod \"a68984fe-0046-43ea-b5f1-2809a85ff847\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.483046 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-credential-keys\") pod \"a68984fe-0046-43ea-b5f1-2809a85ff847\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.483154 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-scripts\") pod \"a68984fe-0046-43ea-b5f1-2809a85ff847\" (UID: \"a68984fe-0046-43ea-b5f1-2809a85ff847\") " Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.489507 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a68984fe-0046-43ea-b5f1-2809a85ff847" (UID: "a68984fe-0046-43ea-b5f1-2809a85ff847"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.502551 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a68984fe-0046-43ea-b5f1-2809a85ff847" (UID: "a68984fe-0046-43ea-b5f1-2809a85ff847"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.502936 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68984fe-0046-43ea-b5f1-2809a85ff847-kube-api-access-4wrpc" (OuterVolumeSpecName: "kube-api-access-4wrpc") pod "a68984fe-0046-43ea-b5f1-2809a85ff847" (UID: "a68984fe-0046-43ea-b5f1-2809a85ff847"). InnerVolumeSpecName "kube-api-access-4wrpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.504908 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-scripts" (OuterVolumeSpecName: "scripts") pod "a68984fe-0046-43ea-b5f1-2809a85ff847" (UID: "a68984fe-0046-43ea-b5f1-2809a85ff847"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.509240 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a68984fe-0046-43ea-b5f1-2809a85ff847" (UID: "a68984fe-0046-43ea-b5f1-2809a85ff847"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.513329 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-config-data" (OuterVolumeSpecName: "config-data") pod "a68984fe-0046-43ea-b5f1-2809a85ff847" (UID: "a68984fe-0046-43ea-b5f1-2809a85ff847"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.585558 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.585604 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.585621 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wrpc\" (UniqueName: \"kubernetes.io/projected/a68984fe-0046-43ea-b5f1-2809a85ff847-kube-api-access-4wrpc\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.585637 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.585648 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.585658 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a68984fe-0046-43ea-b5f1-2809a85ff847-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.936893 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-54nxb" event={"ID":"a68984fe-0046-43ea-b5f1-2809a85ff847","Type":"ContainerDied","Data":"76632152e1452930698fd7cf03f179a2d776874587501988634008d40bd4ea38"} Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 
13:39:25.937237 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76632152e1452930698fd7cf03f179a2d776874587501988634008d40bd4ea38" Mar 18 13:39:25 crc kubenswrapper[4921]: I0318 13:39:25.936950 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-54nxb" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.034768 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-758655498b-jd8n6"] Mar 18 13:39:26 crc kubenswrapper[4921]: E0318 13:39:26.035245 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68984fe-0046-43ea-b5f1-2809a85ff847" containerName="keystone-bootstrap" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.035273 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68984fe-0046-43ea-b5f1-2809a85ff847" containerName="keystone-bootstrap" Mar 18 13:39:26 crc kubenswrapper[4921]: E0318 13:39:26.035307 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerName="init" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.035318 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerName="init" Mar 18 13:39:26 crc kubenswrapper[4921]: E0318 13:39:26.035339 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerName="dnsmasq-dns" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.035347 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerName="dnsmasq-dns" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.035541 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ba0bd3-9942-4de3-9a07-3628472692ec" containerName="dnsmasq-dns" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.035577 4921 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a68984fe-0046-43ea-b5f1-2809a85ff847" containerName="keystone-bootstrap" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.036306 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.038315 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.039002 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.039184 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vl2wb" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.039345 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.055493 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-758655498b-jd8n6"] Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.095413 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-credential-keys\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.095491 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-config-data\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.095513 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-scripts\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.095565 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdpz7\" (UniqueName: \"kubernetes.io/projected/074a690a-ebc7-418c-afb0-abbab8c638d7-kube-api-access-rdpz7\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.095593 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-fernet-keys\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.095638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-combined-ca-bundle\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.197996 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-config-data\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.198170 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-scripts\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.198303 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdpz7\" (UniqueName: \"kubernetes.io/projected/074a690a-ebc7-418c-afb0-abbab8c638d7-kube-api-access-rdpz7\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.198344 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-fernet-keys\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.198467 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-combined-ca-bundle\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.198649 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-credential-keys\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.202914 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-scripts\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.203783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-combined-ca-bundle\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.204464 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-fernet-keys\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.205324 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-config-data\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.205327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/074a690a-ebc7-418c-afb0-abbab8c638d7-credential-keys\") pod \"keystone-758655498b-jd8n6\" (UID: \"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.223020 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdpz7\" (UniqueName: \"kubernetes.io/projected/074a690a-ebc7-418c-afb0-abbab8c638d7-kube-api-access-rdpz7\") pod \"keystone-758655498b-jd8n6\" (UID: 
\"074a690a-ebc7-418c-afb0-abbab8c638d7\") " pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.375302 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.829211 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-758655498b-jd8n6"] Mar 18 13:39:26 crc kubenswrapper[4921]: I0318 13:39:26.953906 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-758655498b-jd8n6" event={"ID":"074a690a-ebc7-418c-afb0-abbab8c638d7","Type":"ContainerStarted","Data":"9c22003741adbb0d23b254216bd0bf91782e63d42d31d38a297ee195f181fba1"} Mar 18 13:39:27 crc kubenswrapper[4921]: I0318 13:39:27.964271 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-758655498b-jd8n6" event={"ID":"074a690a-ebc7-418c-afb0-abbab8c638d7","Type":"ContainerStarted","Data":"a967d70071df25b7f5ea2456ba74645352f08a6c5740c303aca8fa5adf34b55d"} Mar 18 13:39:27 crc kubenswrapper[4921]: I0318 13:39:27.964682 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:39:27 crc kubenswrapper[4921]: I0318 13:39:27.986712 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-758655498b-jd8n6" podStartSLOduration=1.986688632 podStartE2EDuration="1.986688632s" podCreationTimestamp="2026-03-18 13:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:39:27.986230149 +0000 UTC m=+5387.536150798" watchObservedRunningTime="2026-03-18 13:39:27.986688632 +0000 UTC m=+5387.536609301" Mar 18 13:39:58 crc kubenswrapper[4921]: I0318 13:39:58.132778 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-758655498b-jd8n6" Mar 18 13:40:00 crc 
kubenswrapper[4921]: I0318 13:40:00.172377 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564020-2q92n"]
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.174291 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.178197 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.178242 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.179225 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.182437 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-2q92n"]
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.307993 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4tv\" (UniqueName: \"kubernetes.io/projected/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a-kube-api-access-6f4tv\") pod \"auto-csr-approver-29564020-2q92n\" (UID: \"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a\") " pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.411237 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4tv\" (UniqueName: \"kubernetes.io/projected/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a-kube-api-access-6f4tv\") pod \"auto-csr-approver-29564020-2q92n\" (UID: \"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a\") " pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.437825 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4tv\" (UniqueName: \"kubernetes.io/projected/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a-kube-api-access-6f4tv\") pod \"auto-csr-approver-29564020-2q92n\" (UID: \"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a\") " pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.499299 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.761788 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.768035 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.772224 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.772588 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jhssz"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.776033 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.816041 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.828222 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.837458 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-2q92n"]
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.926686 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config-secret\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.926798 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:00 crc kubenswrapper[4921]: I0318 13:40:00.926863 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpx7\" (UniqueName: \"kubernetes.io/projected/58256fdc-88b6-4354-a86c-17dc1aebba44-kube-api-access-ndpx7\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.028756 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpx7\" (UniqueName: \"kubernetes.io/projected/58256fdc-88b6-4354-a86c-17dc1aebba44-kube-api-access-ndpx7\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.029300 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config-secret\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.029475 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.030779 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.035777 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config-secret\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.047028 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpx7\" (UniqueName: \"kubernetes.io/projected/58256fdc-88b6-4354-a86c-17dc1aebba44-kube-api-access-ndpx7\") pod \"openstackclient\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.153634 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.269131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-2q92n" event={"ID":"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a","Type":"ContainerStarted","Data":"06f868ca9018b5034e14ebf8e5cbfebdaf4f659ca336d6e491b9f277ac264eec"}
Mar 18 13:40:01 crc kubenswrapper[4921]: I0318 13:40:01.654653 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 13:40:01 crc kubenswrapper[4921]: W0318 13:40:01.658346 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58256fdc_88b6_4354_a86c_17dc1aebba44.slice/crio-faf9b5f93db439d280ce4d24071de0f50839a84b8dddf402ee87942f87c4f28e WatchSource:0}: Error finding container faf9b5f93db439d280ce4d24071de0f50839a84b8dddf402ee87942f87c4f28e: Status 404 returned error can't find the container with id faf9b5f93db439d280ce4d24071de0f50839a84b8dddf402ee87942f87c4f28e
Mar 18 13:40:02 crc kubenswrapper[4921]: I0318 13:40:02.279823 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"58256fdc-88b6-4354-a86c-17dc1aebba44","Type":"ContainerStarted","Data":"da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa"}
Mar 18 13:40:02 crc kubenswrapper[4921]: I0318 13:40:02.280261 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"58256fdc-88b6-4354-a86c-17dc1aebba44","Type":"ContainerStarted","Data":"faf9b5f93db439d280ce4d24071de0f50839a84b8dddf402ee87942f87c4f28e"}
Mar 18 13:40:02 crc kubenswrapper[4921]: I0318 13:40:02.300981 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.300951659 podStartE2EDuration="2.300951659s" podCreationTimestamp="2026-03-18 13:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:40:02.296361068 +0000 UTC m=+5421.846281707" watchObservedRunningTime="2026-03-18 13:40:02.300951659 +0000 UTC m=+5421.850872298"
Mar 18 13:40:03 crc kubenswrapper[4921]: I0318 13:40:03.289315 4921 generic.go:334] "Generic (PLEG): container finished" podID="dd58bd96-9f87-4909-b9aa-2cb0fc619b6a" containerID="b74bdd4f2d85a3fc5bd6360df807a43f16e16fd4720dc30fb44e82a94cd95aa3" exitCode=0
Mar 18 13:40:03 crc kubenswrapper[4921]: I0318 13:40:03.289406 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-2q92n" event={"ID":"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a","Type":"ContainerDied","Data":"b74bdd4f2d85a3fc5bd6360df807a43f16e16fd4720dc30fb44e82a94cd95aa3"}
Mar 18 13:40:04 crc kubenswrapper[4921]: I0318 13:40:04.623522 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:04 crc kubenswrapper[4921]: I0318 13:40:04.701947 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f4tv\" (UniqueName: \"kubernetes.io/projected/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a-kube-api-access-6f4tv\") pod \"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a\" (UID: \"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a\") "
Mar 18 13:40:04 crc kubenswrapper[4921]: I0318 13:40:04.709688 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a-kube-api-access-6f4tv" (OuterVolumeSpecName: "kube-api-access-6f4tv") pod "dd58bd96-9f87-4909-b9aa-2cb0fc619b6a" (UID: "dd58bd96-9f87-4909-b9aa-2cb0fc619b6a"). InnerVolumeSpecName "kube-api-access-6f4tv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:40:04 crc kubenswrapper[4921]: I0318 13:40:04.805419 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f4tv\" (UniqueName: \"kubernetes.io/projected/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a-kube-api-access-6f4tv\") on node \"crc\" DevicePath \"\""
Mar 18 13:40:05 crc kubenswrapper[4921]: I0318 13:40:05.311042 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-2q92n" event={"ID":"dd58bd96-9f87-4909-b9aa-2cb0fc619b6a","Type":"ContainerDied","Data":"06f868ca9018b5034e14ebf8e5cbfebdaf4f659ca336d6e491b9f277ac264eec"}
Mar 18 13:40:05 crc kubenswrapper[4921]: I0318 13:40:05.311337 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06f868ca9018b5034e14ebf8e5cbfebdaf4f659ca336d6e491b9f277ac264eec"
Mar 18 13:40:05 crc kubenswrapper[4921]: I0318 13:40:05.311094 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-2q92n"
Mar 18 13:40:05 crc kubenswrapper[4921]: I0318 13:40:05.716291 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-xjq4t"]
Mar 18 13:40:05 crc kubenswrapper[4921]: I0318 13:40:05.724585 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-xjq4t"]
Mar 18 13:40:07 crc kubenswrapper[4921]: I0318 13:40:07.223421 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb3bba4-e4ce-47af-aec9-495dc8de6cde" path="/var/lib/kubelet/pods/5bb3bba4-e4ce-47af-aec9-495dc8de6cde/volumes"
Mar 18 13:40:07 crc kubenswrapper[4921]: I0318 13:40:07.875798 4921 scope.go:117] "RemoveContainer" containerID="cbc676dfaad8711a3030141aec29f84fb4465d674115228622d662c85f6b370d"
Mar 18 13:40:17 crc kubenswrapper[4921]: I0318 13:40:17.081663 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:40:17 crc kubenswrapper[4921]: I0318 13:40:17.083265 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:40:47 crc kubenswrapper[4921]: I0318 13:40:47.081494 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:40:47 crc kubenswrapper[4921]: I0318 13:40:47.082423 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:41:09 crc kubenswrapper[4921]: I0318 13:41:09.060084 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8m58f"]
Mar 18 13:41:09 crc kubenswrapper[4921]: I0318 13:41:09.070430 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8m58f"]
Mar 18 13:41:09 crc kubenswrapper[4921]: I0318 13:41:09.224080 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41abfa9f-2f52-410c-858f-97cc71b3a8c0" path="/var/lib/kubelet/pods/41abfa9f-2f52-410c-858f-97cc71b3a8c0/volumes"
Mar 18 13:41:17 crc kubenswrapper[4921]: I0318 13:41:17.081335 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 13:41:17 crc kubenswrapper[4921]: I0318 13:41:17.082217 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 13:41:17 crc kubenswrapper[4921]: I0318 13:41:17.082268 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7"
Mar 18 13:41:17 crc kubenswrapper[4921]: I0318 13:41:17.083076 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"925f41448af3066f1f8956601297f08a71fa65c35e3c953610bfb65d6cea1e9b"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 13:41:17 crc kubenswrapper[4921]: I0318 13:41:17.083167 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://925f41448af3066f1f8956601297f08a71fa65c35e3c953610bfb65d6cea1e9b" gracePeriod=600
Mar 18 13:41:18 crc kubenswrapper[4921]: I0318 13:41:18.001872 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="925f41448af3066f1f8956601297f08a71fa65c35e3c953610bfb65d6cea1e9b" exitCode=0
Mar 18 13:41:18 crc kubenswrapper[4921]: I0318 13:41:18.001948 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"925f41448af3066f1f8956601297f08a71fa65c35e3c953610bfb65d6cea1e9b"}
Mar 18 13:41:18 crc kubenswrapper[4921]: I0318 13:41:18.002668 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d"}
Mar 18 13:41:18 crc kubenswrapper[4921]: I0318 13:41:18.002708 4921 scope.go:117] "RemoveContainer" containerID="f52b00069903a06bb1c403f32b112d99c3580f358080d0a2dab07b4bd69d713d"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.063003 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-p9ntc"]
Mar 18 13:41:46 crc kubenswrapper[4921]: E0318 13:41:46.063967 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd58bd96-9f87-4909-b9aa-2cb0fc619b6a" containerName="oc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.063984 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd58bd96-9f87-4909-b9aa-2cb0fc619b6a" containerName="oc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.064203 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd58bd96-9f87-4909-b9aa-2cb0fc619b6a" containerName="oc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.064787 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.076498 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-p9ntc"]
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.160944 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-dfce-account-create-update-mkhgn"]
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.161987 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.163875 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.178489 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dfce-account-create-update-mkhgn"]
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.188027 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hhg\" (UniqueName: \"kubernetes.io/projected/5c79fa37-ae65-4e13-a045-1e998aac23ce-kube-api-access-l7hhg\") pod \"barbican-db-create-p9ntc\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") " pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.188135 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c79fa37-ae65-4e13-a045-1e998aac23ce-operator-scripts\") pod \"barbican-db-create-p9ntc\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") " pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.289744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b3eb96-3132-4a26-bb74-45dc59627711-operator-scripts\") pod \"barbican-dfce-account-create-update-mkhgn\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") " pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.297372 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hhg\" (UniqueName: \"kubernetes.io/projected/5c79fa37-ae65-4e13-a045-1e998aac23ce-kube-api-access-l7hhg\") pod \"barbican-db-create-p9ntc\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") " pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.297640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c79fa37-ae65-4e13-a045-1e998aac23ce-operator-scripts\") pod \"barbican-db-create-p9ntc\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") " pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.297775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b864p\" (UniqueName: \"kubernetes.io/projected/33b3eb96-3132-4a26-bb74-45dc59627711-kube-api-access-b864p\") pod \"barbican-dfce-account-create-update-mkhgn\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") " pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.298552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c79fa37-ae65-4e13-a045-1e998aac23ce-operator-scripts\") pod \"barbican-db-create-p9ntc\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") " pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.316744 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7hhg\" (UniqueName: \"kubernetes.io/projected/5c79fa37-ae65-4e13-a045-1e998aac23ce-kube-api-access-l7hhg\") pod \"barbican-db-create-p9ntc\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") " pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.383692 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.401561 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b864p\" (UniqueName: \"kubernetes.io/projected/33b3eb96-3132-4a26-bb74-45dc59627711-kube-api-access-b864p\") pod \"barbican-dfce-account-create-update-mkhgn\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") " pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.401644 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b3eb96-3132-4a26-bb74-45dc59627711-operator-scripts\") pod \"barbican-dfce-account-create-update-mkhgn\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") " pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.403348 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b3eb96-3132-4a26-bb74-45dc59627711-operator-scripts\") pod \"barbican-dfce-account-create-update-mkhgn\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") " pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.427656 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b864p\" (UniqueName: \"kubernetes.io/projected/33b3eb96-3132-4a26-bb74-45dc59627711-kube-api-access-b864p\") pod \"barbican-dfce-account-create-update-mkhgn\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") " pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.483015 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.686032 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-p9ntc"]
Mar 18 13:41:46 crc kubenswrapper[4921]: I0318 13:41:46.977599 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dfce-account-create-update-mkhgn"]
Mar 18 13:41:46 crc kubenswrapper[4921]: W0318 13:41:46.980752 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b3eb96_3132_4a26_bb74_45dc59627711.slice/crio-e603b8a85b5c0e9cd9c8348dc049c204e8fed5838532c80a8d89a8e11cdcaa12 WatchSource:0}: Error finding container e603b8a85b5c0e9cd9c8348dc049c204e8fed5838532c80a8d89a8e11cdcaa12: Status 404 returned error can't find the container with id e603b8a85b5c0e9cd9c8348dc049c204e8fed5838532c80a8d89a8e11cdcaa12
Mar 18 13:41:47 crc kubenswrapper[4921]: I0318 13:41:47.277782 4921 generic.go:334] "Generic (PLEG): container finished" podID="5c79fa37-ae65-4e13-a045-1e998aac23ce" containerID="dc48bb584ffeee1db218426789045e8bb44456274eba11a8ad6f76e174417222" exitCode=0
Mar 18 13:41:47 crc kubenswrapper[4921]: I0318 13:41:47.277883 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p9ntc" event={"ID":"5c79fa37-ae65-4e13-a045-1e998aac23ce","Type":"ContainerDied","Data":"dc48bb584ffeee1db218426789045e8bb44456274eba11a8ad6f76e174417222"}
Mar 18 13:41:47 crc kubenswrapper[4921]: I0318 13:41:47.277921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p9ntc" event={"ID":"5c79fa37-ae65-4e13-a045-1e998aac23ce","Type":"ContainerStarted","Data":"4253d5ebfdf4165d394a6d5e9199b9cdef3bedd3536809c53b30386d945281de"}
Mar 18 13:41:47 crc kubenswrapper[4921]: I0318 13:41:47.282399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dfce-account-create-update-mkhgn" event={"ID":"33b3eb96-3132-4a26-bb74-45dc59627711","Type":"ContainerStarted","Data":"d3859583c35b4254d7afb380a9fb6845b6aaa54f2e3f7f2fdd1e9da7d62b25c2"}
Mar 18 13:41:47 crc kubenswrapper[4921]: I0318 13:41:47.282475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dfce-account-create-update-mkhgn" event={"ID":"33b3eb96-3132-4a26-bb74-45dc59627711","Type":"ContainerStarted","Data":"e603b8a85b5c0e9cd9c8348dc049c204e8fed5838532c80a8d89a8e11cdcaa12"}
Mar 18 13:41:47 crc kubenswrapper[4921]: I0318 13:41:47.326147 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-dfce-account-create-update-mkhgn" podStartSLOduration=1.326096716 podStartE2EDuration="1.326096716s" podCreationTimestamp="2026-03-18 13:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:41:47.317308935 +0000 UTC m=+5526.867229604" watchObservedRunningTime="2026-03-18 13:41:47.326096716 +0000 UTC m=+5526.876017365"
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.293448 4921 generic.go:334] "Generic (PLEG): container finished" podID="33b3eb96-3132-4a26-bb74-45dc59627711" containerID="d3859583c35b4254d7afb380a9fb6845b6aaa54f2e3f7f2fdd1e9da7d62b25c2" exitCode=0
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.293556 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dfce-account-create-update-mkhgn" event={"ID":"33b3eb96-3132-4a26-bb74-45dc59627711","Type":"ContainerDied","Data":"d3859583c35b4254d7afb380a9fb6845b6aaa54f2e3f7f2fdd1e9da7d62b25c2"}
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.645252 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.763907 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c79fa37-ae65-4e13-a045-1e998aac23ce-operator-scripts\") pod \"5c79fa37-ae65-4e13-a045-1e998aac23ce\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") "
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.764199 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7hhg\" (UniqueName: \"kubernetes.io/projected/5c79fa37-ae65-4e13-a045-1e998aac23ce-kube-api-access-l7hhg\") pod \"5c79fa37-ae65-4e13-a045-1e998aac23ce\" (UID: \"5c79fa37-ae65-4e13-a045-1e998aac23ce\") "
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.764852 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c79fa37-ae65-4e13-a045-1e998aac23ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c79fa37-ae65-4e13-a045-1e998aac23ce" (UID: "5c79fa37-ae65-4e13-a045-1e998aac23ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.772327 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c79fa37-ae65-4e13-a045-1e998aac23ce-kube-api-access-l7hhg" (OuterVolumeSpecName: "kube-api-access-l7hhg") pod "5c79fa37-ae65-4e13-a045-1e998aac23ce" (UID: "5c79fa37-ae65-4e13-a045-1e998aac23ce"). InnerVolumeSpecName "kube-api-access-l7hhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.866507 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7hhg\" (UniqueName: \"kubernetes.io/projected/5c79fa37-ae65-4e13-a045-1e998aac23ce-kube-api-access-l7hhg\") on node \"crc\" DevicePath \"\""
Mar 18 13:41:48 crc kubenswrapper[4921]: I0318 13:41:48.866549 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c79fa37-ae65-4e13-a045-1e998aac23ce-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.304530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p9ntc" event={"ID":"5c79fa37-ae65-4e13-a045-1e998aac23ce","Type":"ContainerDied","Data":"4253d5ebfdf4165d394a6d5e9199b9cdef3bedd3536809c53b30386d945281de"}
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.306081 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4253d5ebfdf4165d394a6d5e9199b9cdef3bedd3536809c53b30386d945281de"
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.304596 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p9ntc"
Mar 18 13:41:49 crc kubenswrapper[4921]: E0318 13:41:49.430215 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c79fa37_ae65_4e13_a045_1e998aac23ce.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c79fa37_ae65_4e13_a045_1e998aac23ce.slice/crio-4253d5ebfdf4165d394a6d5e9199b9cdef3bedd3536809c53b30386d945281de\": RecentStats: unable to find data in memory cache]"
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.624314 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.786531 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b3eb96-3132-4a26-bb74-45dc59627711-operator-scripts\") pod \"33b3eb96-3132-4a26-bb74-45dc59627711\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") "
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.786756 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b864p\" (UniqueName: \"kubernetes.io/projected/33b3eb96-3132-4a26-bb74-45dc59627711-kube-api-access-b864p\") pod \"33b3eb96-3132-4a26-bb74-45dc59627711\" (UID: \"33b3eb96-3132-4a26-bb74-45dc59627711\") "
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.787501 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b3eb96-3132-4a26-bb74-45dc59627711-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33b3eb96-3132-4a26-bb74-45dc59627711" (UID: "33b3eb96-3132-4a26-bb74-45dc59627711"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.788142 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b3eb96-3132-4a26-bb74-45dc59627711-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.791008 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b3eb96-3132-4a26-bb74-45dc59627711-kube-api-access-b864p" (OuterVolumeSpecName: "kube-api-access-b864p") pod "33b3eb96-3132-4a26-bb74-45dc59627711" (UID: "33b3eb96-3132-4a26-bb74-45dc59627711"). InnerVolumeSpecName "kube-api-access-b864p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:41:49 crc kubenswrapper[4921]: I0318 13:41:49.889915 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b864p\" (UniqueName: \"kubernetes.io/projected/33b3eb96-3132-4a26-bb74-45dc59627711-kube-api-access-b864p\") on node \"crc\" DevicePath \"\""
Mar 18 13:41:50 crc kubenswrapper[4921]: I0318 13:41:50.323551 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dfce-account-create-update-mkhgn" event={"ID":"33b3eb96-3132-4a26-bb74-45dc59627711","Type":"ContainerDied","Data":"e603b8a85b5c0e9cd9c8348dc049c204e8fed5838532c80a8d89a8e11cdcaa12"}
Mar 18 13:41:50 crc kubenswrapper[4921]: I0318 13:41:50.324257 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e603b8a85b5c0e9cd9c8348dc049c204e8fed5838532c80a8d89a8e11cdcaa12"
Mar 18 13:41:50 crc kubenswrapper[4921]: I0318 13:41:50.323628 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dfce-account-create-update-mkhgn"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.463014 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lkbft"]
Mar 18 13:41:51 crc kubenswrapper[4921]: E0318 13:41:51.463529 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b3eb96-3132-4a26-bb74-45dc59627711" containerName="mariadb-account-create-update"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.463547 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b3eb96-3132-4a26-bb74-45dc59627711" containerName="mariadb-account-create-update"
Mar 18 13:41:51 crc kubenswrapper[4921]: E0318 13:41:51.463563 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c79fa37-ae65-4e13-a045-1e998aac23ce" containerName="mariadb-database-create"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.463570 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c79fa37-ae65-4e13-a045-1e998aac23ce" containerName="mariadb-database-create"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.463763 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c79fa37-ae65-4e13-a045-1e998aac23ce" containerName="mariadb-database-create"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.463781 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b3eb96-3132-4a26-bb74-45dc59627711" containerName="mariadb-account-create-update"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.464575 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.467266 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.467535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-r4qv8"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.478295 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lkbft"]
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.526248 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-combined-ca-bundle\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.526297 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-db-sync-config-data\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.526332 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vss8\" (UniqueName: \"kubernetes.io/projected/ca8fb267-1b96-4722-90ae-94b07acfa50b-kube-api-access-8vss8\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.627603 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-combined-ca-bundle\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.627985 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-db-sync-config-data\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.628039 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vss8\" (UniqueName: \"kubernetes.io/projected/ca8fb267-1b96-4722-90ae-94b07acfa50b-kube-api-access-8vss8\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.635051 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-combined-ca-bundle\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.635524 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-db-sync-config-data\") pod \"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft"
Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.652791 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vss8\" (UniqueName: \"kubernetes.io/projected/ca8fb267-1b96-4722-90ae-94b07acfa50b-kube-api-access-8vss8\") pod 
\"barbican-db-sync-lkbft\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " pod="openstack/barbican-db-sync-lkbft" Mar 18 13:41:51 crc kubenswrapper[4921]: I0318 13:41:51.792079 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkbft" Mar 18 13:41:52 crc kubenswrapper[4921]: I0318 13:41:52.283062 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lkbft"] Mar 18 13:41:52 crc kubenswrapper[4921]: I0318 13:41:52.343633 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkbft" event={"ID":"ca8fb267-1b96-4722-90ae-94b07acfa50b","Type":"ContainerStarted","Data":"5948e0672e572e78333c9c4b97927a3dbabfb6a309f12a447bb63628cd3792ba"} Mar 18 13:41:53 crc kubenswrapper[4921]: I0318 13:41:53.353983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkbft" event={"ID":"ca8fb267-1b96-4722-90ae-94b07acfa50b","Type":"ContainerStarted","Data":"ac617f6c96fc3defd09220a35624ecd005f20833ab6bf85d916db1783ff2fbd5"} Mar 18 13:41:53 crc kubenswrapper[4921]: I0318 13:41:53.368711 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lkbft" podStartSLOduration=2.368686132 podStartE2EDuration="2.368686132s" podCreationTimestamp="2026-03-18 13:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:41:53.36547097 +0000 UTC m=+5532.915391609" watchObservedRunningTime="2026-03-18 13:41:53.368686132 +0000 UTC m=+5532.918606771" Mar 18 13:41:54 crc kubenswrapper[4921]: I0318 13:41:54.364395 4921 generic.go:334] "Generic (PLEG): container finished" podID="ca8fb267-1b96-4722-90ae-94b07acfa50b" containerID="ac617f6c96fc3defd09220a35624ecd005f20833ab6bf85d916db1783ff2fbd5" exitCode=0 Mar 18 13:41:54 crc kubenswrapper[4921]: I0318 13:41:54.364453 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-db-sync-lkbft" event={"ID":"ca8fb267-1b96-4722-90ae-94b07acfa50b","Type":"ContainerDied","Data":"ac617f6c96fc3defd09220a35624ecd005f20833ab6bf85d916db1783ff2fbd5"} Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.772152 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkbft" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.805522 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-db-sync-config-data\") pod \"ca8fb267-1b96-4722-90ae-94b07acfa50b\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.805579 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vss8\" (UniqueName: \"kubernetes.io/projected/ca8fb267-1b96-4722-90ae-94b07acfa50b-kube-api-access-8vss8\") pod \"ca8fb267-1b96-4722-90ae-94b07acfa50b\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.805624 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-combined-ca-bundle\") pod \"ca8fb267-1b96-4722-90ae-94b07acfa50b\" (UID: \"ca8fb267-1b96-4722-90ae-94b07acfa50b\") " Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.811994 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca8fb267-1b96-4722-90ae-94b07acfa50b" (UID: "ca8fb267-1b96-4722-90ae-94b07acfa50b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.812486 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8fb267-1b96-4722-90ae-94b07acfa50b-kube-api-access-8vss8" (OuterVolumeSpecName: "kube-api-access-8vss8") pod "ca8fb267-1b96-4722-90ae-94b07acfa50b" (UID: "ca8fb267-1b96-4722-90ae-94b07acfa50b"). InnerVolumeSpecName "kube-api-access-8vss8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.830455 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca8fb267-1b96-4722-90ae-94b07acfa50b" (UID: "ca8fb267-1b96-4722-90ae-94b07acfa50b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.907896 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.907927 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca8fb267-1b96-4722-90ae-94b07acfa50b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:55.907941 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vss8\" (UniqueName: \"kubernetes.io/projected/ca8fb267-1b96-4722-90ae-94b07acfa50b-kube-api-access-8vss8\") on node \"crc\" DevicePath \"\"" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.387597 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkbft" 
event={"ID":"ca8fb267-1b96-4722-90ae-94b07acfa50b","Type":"ContainerDied","Data":"5948e0672e572e78333c9c4b97927a3dbabfb6a309f12a447bb63628cd3792ba"} Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.387652 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5948e0672e572e78333c9c4b97927a3dbabfb6a309f12a447bb63628cd3792ba" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.387727 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkbft" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.659038 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-567d546d79-rgpwb"] Mar 18 13:41:56 crc kubenswrapper[4921]: E0318 13:41:56.659601 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8fb267-1b96-4722-90ae-94b07acfa50b" containerName="barbican-db-sync" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.659625 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8fb267-1b96-4722-90ae-94b07acfa50b" containerName="barbican-db-sync" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.659853 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca8fb267-1b96-4722-90ae-94b07acfa50b" containerName="barbican-db-sync" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.660966 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.666850 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.676411 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f5794fb5d-9w9n9"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.678038 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.678532 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-r4qv8" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.679443 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.685173 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.689531 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f5794fb5d-9w9n9"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.709634 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-567d546d79-rgpwb"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.775297 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-2ltxd"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.776854 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.794106 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-2ltxd"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.829780 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99927959-518a-4cbe-8e6c-36e060549d91-logs\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.829870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-config-data-custom\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.829931 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdcgf\" (UniqueName: \"kubernetes.io/projected/e679d5e2-0e15-4b0a-bf61-972499585c0b-kube-api-access-bdcgf\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.829956 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-config-data\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.829988 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.830023 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-config-data\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.830074 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvxvr\" (UniqueName: \"kubernetes.io/projected/99927959-518a-4cbe-8e6c-36e060549d91-kube-api-access-pvxvr\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.830152 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-combined-ca-bundle\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.830192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e679d5e2-0e15-4b0a-bf61-972499585c0b-logs\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: 
\"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.830230 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-config-data-custom\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.851953 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f8cd65696-hnlnk"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.853848 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.859339 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.872334 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f8cd65696-hnlnk"] Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.931813 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-combined-ca-bundle\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.931927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e679d5e2-0e15-4b0a-bf61-972499585c0b-logs\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " 
pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932013 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-config-data-custom\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932046 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-config\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932728 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99927959-518a-4cbe-8e6c-36e060549d91-logs\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932795 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-config-data-custom\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932864 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: 
\"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932923 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdcgf\" (UniqueName: \"kubernetes.io/projected/e679d5e2-0e15-4b0a-bf61-972499585c0b-kube-api-access-bdcgf\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.932953 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-config-data\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.933004 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.933030 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-config-data\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.933068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.933133 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvxvr\" (UniqueName: \"kubernetes.io/projected/99927959-518a-4cbe-8e6c-36e060549d91-kube-api-access-pvxvr\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.933162 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4kdt\" (UniqueName: \"kubernetes.io/projected/42754439-0377-4dfa-b2f6-32843de55d0a-kube-api-access-j4kdt\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.933192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-dns-svc\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.934580 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99927959-518a-4cbe-8e6c-36e060549d91-logs\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.935342 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e679d5e2-0e15-4b0a-bf61-972499585c0b-logs\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.946806 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-config-data\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.949840 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-combined-ca-bundle\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.951965 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-config-data\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.952572 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-combined-ca-bundle\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.952667 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/99927959-518a-4cbe-8e6c-36e060549d91-config-data-custom\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.953093 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvxvr\" (UniqueName: \"kubernetes.io/projected/99927959-518a-4cbe-8e6c-36e060549d91-kube-api-access-pvxvr\") pod \"barbican-worker-567d546d79-rgpwb\" (UID: \"99927959-518a-4cbe-8e6c-36e060549d91\") " pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.955798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdcgf\" (UniqueName: \"kubernetes.io/projected/e679d5e2-0e15-4b0a-bf61-972499585c0b-kube-api-access-bdcgf\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.956072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e679d5e2-0e15-4b0a-bf61-972499585c0b-config-data-custom\") pod \"barbican-keystone-listener-5f5794fb5d-9w9n9\" (UID: \"e679d5e2-0e15-4b0a-bf61-972499585c0b\") " pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:56 crc kubenswrapper[4921]: I0318 13:41:56.980191 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-567d546d79-rgpwb" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.002957 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034676 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-config\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034751 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-config-data-custom\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034791 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-combined-ca-bundle\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034810 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvtq\" (UniqueName: \"kubernetes.io/projected/443d9a97-2a7c-4067-adae-3d0a8f914283-kube-api-access-8rvtq\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034860 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034883 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-config-data\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034931 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443d9a97-2a7c-4067-adae-3d0a8f914283-logs\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034953 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.034974 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4kdt\" (UniqueName: \"kubernetes.io/projected/42754439-0377-4dfa-b2f6-32843de55d0a-kube-api-access-j4kdt\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.035000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-dns-svc\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: 
\"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.036187 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-dns-svc\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.036825 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.037040 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-config\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.037082 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.056337 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4kdt\" (UniqueName: \"kubernetes.io/projected/42754439-0377-4dfa-b2f6-32843de55d0a-kube-api-access-j4kdt\") pod \"dnsmasq-dns-869545f9c9-2ltxd\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc 
kubenswrapper[4921]: I0318 13:41:57.095865 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.136198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-config-data\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.136276 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443d9a97-2a7c-4067-adae-3d0a8f914283-logs\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.136342 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-config-data-custom\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.136377 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-combined-ca-bundle\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.136396 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvtq\" (UniqueName: \"kubernetes.io/projected/443d9a97-2a7c-4067-adae-3d0a8f914283-kube-api-access-8rvtq\") pod 
\"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.138444 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/443d9a97-2a7c-4067-adae-3d0a8f914283-logs\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.141803 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-combined-ca-bundle\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.149140 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-config-data-custom\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.152631 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/443d9a97-2a7c-4067-adae-3d0a8f914283-config-data\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.156721 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvtq\" (UniqueName: \"kubernetes.io/projected/443d9a97-2a7c-4067-adae-3d0a8f914283-kube-api-access-8rvtq\") pod \"barbican-api-5f8cd65696-hnlnk\" (UID: \"443d9a97-2a7c-4067-adae-3d0a8f914283\") " 
pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.191137 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.470179 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-567d546d79-rgpwb"] Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.567389 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f5794fb5d-9w9n9"] Mar 18 13:41:57 crc kubenswrapper[4921]: W0318 13:41:57.576874 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode679d5e2_0e15_4b0a_bf61_972499585c0b.slice/crio-7d8362d7241f7171fde535a8675b624390bd5037c2c7a4e15d9ffaed39776b23 WatchSource:0}: Error finding container 7d8362d7241f7171fde535a8675b624390bd5037c2c7a4e15d9ffaed39776b23: Status 404 returned error can't find the container with id 7d8362d7241f7171fde535a8675b624390bd5037c2c7a4e15d9ffaed39776b23 Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.684657 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-2ltxd"] Mar 18 13:41:57 crc kubenswrapper[4921]: I0318 13:41:57.694405 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f8cd65696-hnlnk"] Mar 18 13:41:57 crc kubenswrapper[4921]: W0318 13:41:57.699542 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42754439_0377_4dfa_b2f6_32843de55d0a.slice/crio-6993fe96de7fe2f3951156c8e11d822028ae419dd11d71ce070bf8f2d2405502 WatchSource:0}: Error finding container 6993fe96de7fe2f3951156c8e11d822028ae419dd11d71ce070bf8f2d2405502: Status 404 returned error can't find the container with id 6993fe96de7fe2f3951156c8e11d822028ae419dd11d71ce070bf8f2d2405502 Mar 18 
13:41:57 crc kubenswrapper[4921]: W0318 13:41:57.702505 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod443d9a97_2a7c_4067_adae_3d0a8f914283.slice/crio-95ed887540e182d9e0085df2f194cf2e8776a30b77075895fd852c2cdaf065aa WatchSource:0}: Error finding container 95ed887540e182d9e0085df2f194cf2e8776a30b77075895fd852c2cdaf065aa: Status 404 returned error can't find the container with id 95ed887540e182d9e0085df2f194cf2e8776a30b77075895fd852c2cdaf065aa Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.439294 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8cd65696-hnlnk" event={"ID":"443d9a97-2a7c-4067-adae-3d0a8f914283","Type":"ContainerStarted","Data":"2b1c3c7137e910f8f1cdd977da6e9972004290125d2454a2556eed698b9390b6"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.439626 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8cd65696-hnlnk" event={"ID":"443d9a97-2a7c-4067-adae-3d0a8f914283","Type":"ContainerStarted","Data":"e99ce9e7c68b8efb5eb0429ef61c8e5d5b5874a88d7714e62d68029b68bf4425"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.439636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f8cd65696-hnlnk" event={"ID":"443d9a97-2a7c-4067-adae-3d0a8f914283","Type":"ContainerStarted","Data":"95ed887540e182d9e0085df2f194cf2e8776a30b77075895fd852c2cdaf065aa"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.440893 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.441202 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.449991 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" event={"ID":"e679d5e2-0e15-4b0a-bf61-972499585c0b","Type":"ContainerStarted","Data":"33b1c70097cd1963efc3ea2caf9b1c5fd10a02fe49f002ffe646c9324a12b50f"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.450050 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" event={"ID":"e679d5e2-0e15-4b0a-bf61-972499585c0b","Type":"ContainerStarted","Data":"7be9848a58e2b1fdaddf69b6ff3249c51ca898727570e0699eefb96a58c626d9"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.450077 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" event={"ID":"e679d5e2-0e15-4b0a-bf61-972499585c0b","Type":"ContainerStarted","Data":"7d8362d7241f7171fde535a8675b624390bd5037c2c7a4e15d9ffaed39776b23"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.476415 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f8cd65696-hnlnk" podStartSLOduration=2.476389829 podStartE2EDuration="2.476389829s" podCreationTimestamp="2026-03-18 13:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:41:58.469716768 +0000 UTC m=+5538.019637417" watchObservedRunningTime="2026-03-18 13:41:58.476389829 +0000 UTC m=+5538.026310478" Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.479202 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-567d546d79-rgpwb" event={"ID":"99927959-518a-4cbe-8e6c-36e060549d91","Type":"ContainerStarted","Data":"eefd1f1c46355386be3f45c0f0ebba77c9810adf85ac522d4b86986051df67ce"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.479274 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-567d546d79-rgpwb" 
event={"ID":"99927959-518a-4cbe-8e6c-36e060549d91","Type":"ContainerStarted","Data":"9a3d4de77629746256a42e28f03fd0b4a026095bedf2675b49763559b0499c98"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.479292 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-567d546d79-rgpwb" event={"ID":"99927959-518a-4cbe-8e6c-36e060549d91","Type":"ContainerStarted","Data":"b21209e6a10eea4a992a2bb19898a239801221b81a7c09ad9db7980ebe40865f"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.506895 4921 generic.go:334] "Generic (PLEG): container finished" podID="42754439-0377-4dfa-b2f6-32843de55d0a" containerID="508153d08c9bb79926a7c188df478eb25849fcbb4c0a01573ed7f8d01bb9fb3d" exitCode=0 Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.506968 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" event={"ID":"42754439-0377-4dfa-b2f6-32843de55d0a","Type":"ContainerDied","Data":"508153d08c9bb79926a7c188df478eb25849fcbb4c0a01573ed7f8d01bb9fb3d"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.507016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" event={"ID":"42754439-0377-4dfa-b2f6-32843de55d0a","Type":"ContainerStarted","Data":"6993fe96de7fe2f3951156c8e11d822028ae419dd11d71ce070bf8f2d2405502"} Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 13:41:58.523357 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f5794fb5d-9w9n9" podStartSLOduration=2.5233235179999998 podStartE2EDuration="2.523323518s" podCreationTimestamp="2026-03-18 13:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:41:58.517530983 +0000 UTC m=+5538.067451632" watchObservedRunningTime="2026-03-18 13:41:58.523323518 +0000 UTC m=+5538.073244157" Mar 18 13:41:58 crc kubenswrapper[4921]: I0318 
13:41:58.617536 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-567d546d79-rgpwb" podStartSLOduration=2.617317402 podStartE2EDuration="2.617317402s" podCreationTimestamp="2026-03-18 13:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:41:58.579212334 +0000 UTC m=+5538.129132983" watchObservedRunningTime="2026-03-18 13:41:58.617317402 +0000 UTC m=+5538.167238041" Mar 18 13:41:59 crc kubenswrapper[4921]: I0318 13:41:59.517351 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" event={"ID":"42754439-0377-4dfa-b2f6-32843de55d0a","Type":"ContainerStarted","Data":"0ab39b001b0e58bd1a2c39824bc72c555b0fa8a0bacb3c420aad6bbfbcbc64f7"} Mar 18 13:41:59 crc kubenswrapper[4921]: I0318 13:41:59.546306 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" podStartSLOduration=3.546281613 podStartE2EDuration="3.546281613s" podCreationTimestamp="2026-03-18 13:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:41:59.543474312 +0000 UTC m=+5539.093394951" watchObservedRunningTime="2026-03-18 13:41:59.546281613 +0000 UTC m=+5539.096202252" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.146085 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564022-mq4z7"] Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.147911 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.151242 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.151338 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.151447 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.160183 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-mq4z7"] Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.209317 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbs4\" (UniqueName: \"kubernetes.io/projected/5eac136f-4ca2-44af-96a1-b20df93c0e01-kube-api-access-5kbs4\") pod \"auto-csr-approver-29564022-mq4z7\" (UID: \"5eac136f-4ca2-44af-96a1-b20df93c0e01\") " pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.311911 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbs4\" (UniqueName: \"kubernetes.io/projected/5eac136f-4ca2-44af-96a1-b20df93c0e01-kube-api-access-5kbs4\") pod \"auto-csr-approver-29564022-mq4z7\" (UID: \"5eac136f-4ca2-44af-96a1-b20df93c0e01\") " pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.352906 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbs4\" (UniqueName: \"kubernetes.io/projected/5eac136f-4ca2-44af-96a1-b20df93c0e01-kube-api-access-5kbs4\") pod \"auto-csr-approver-29564022-mq4z7\" (UID: \"5eac136f-4ca2-44af-96a1-b20df93c0e01\") " 
pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.481698 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.525672 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:42:00 crc kubenswrapper[4921]: I0318 13:42:00.937435 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-mq4z7"] Mar 18 13:42:00 crc kubenswrapper[4921]: W0318 13:42:00.945161 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eac136f_4ca2_44af_96a1_b20df93c0e01.slice/crio-313e201c8d8c60d583cecf2147d0f11aa37eb681251edc38f1fcdf533474470b WatchSource:0}: Error finding container 313e201c8d8c60d583cecf2147d0f11aa37eb681251edc38f1fcdf533474470b: Status 404 returned error can't find the container with id 313e201c8d8c60d583cecf2147d0f11aa37eb681251edc38f1fcdf533474470b Mar 18 13:42:01 crc kubenswrapper[4921]: I0318 13:42:01.535719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" event={"ID":"5eac136f-4ca2-44af-96a1-b20df93c0e01","Type":"ContainerStarted","Data":"313e201c8d8c60d583cecf2147d0f11aa37eb681251edc38f1fcdf533474470b"} Mar 18 13:42:03 crc kubenswrapper[4921]: I0318 13:42:03.557855 4921 generic.go:334] "Generic (PLEG): container finished" podID="5eac136f-4ca2-44af-96a1-b20df93c0e01" containerID="6dbc05c37c241847136410f87a21b5c86e26194e2516942ad1998f466e1dfd5c" exitCode=0 Mar 18 13:42:03 crc kubenswrapper[4921]: I0318 13:42:03.557973 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" 
event={"ID":"5eac136f-4ca2-44af-96a1-b20df93c0e01","Type":"ContainerDied","Data":"6dbc05c37c241847136410f87a21b5c86e26194e2516942ad1998f466e1dfd5c"} Mar 18 13:42:03 crc kubenswrapper[4921]: I0318 13:42:03.871594 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:42:04 crc kubenswrapper[4921]: I0318 13:42:04.953894 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.116943 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kbs4\" (UniqueName: \"kubernetes.io/projected/5eac136f-4ca2-44af-96a1-b20df93c0e01-kube-api-access-5kbs4\") pod \"5eac136f-4ca2-44af-96a1-b20df93c0e01\" (UID: \"5eac136f-4ca2-44af-96a1-b20df93c0e01\") " Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.123216 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eac136f-4ca2-44af-96a1-b20df93c0e01-kube-api-access-5kbs4" (OuterVolumeSpecName: "kube-api-access-5kbs4") pod "5eac136f-4ca2-44af-96a1-b20df93c0e01" (UID: "5eac136f-4ca2-44af-96a1-b20df93c0e01"). InnerVolumeSpecName "kube-api-access-5kbs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.220335 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kbs4\" (UniqueName: \"kubernetes.io/projected/5eac136f-4ca2-44af-96a1-b20df93c0e01-kube-api-access-5kbs4\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.460432 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f8cd65696-hnlnk" Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.579491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" event={"ID":"5eac136f-4ca2-44af-96a1-b20df93c0e01","Type":"ContainerDied","Data":"313e201c8d8c60d583cecf2147d0f11aa37eb681251edc38f1fcdf533474470b"} Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.579891 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313e201c8d8c60d583cecf2147d0f11aa37eb681251edc38f1fcdf533474470b" Mar 18 13:42:05 crc kubenswrapper[4921]: I0318 13:42:05.579569 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-mq4z7" Mar 18 13:42:06 crc kubenswrapper[4921]: I0318 13:42:06.040865 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-pdmm5"] Mar 18 13:42:06 crc kubenswrapper[4921]: I0318 13:42:06.046767 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-pdmm5"] Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.099317 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.178842 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-j2s62"] Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.179171 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="dnsmasq-dns" containerID="cri-o://6034c34302167f7e6fe51f3e0584d14fcd75468f0ac96ae3cc2e61ee06721561" gracePeriod=10 Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.221186 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16d5b6c-3663-4a39-a43f-e3e055a61a0f" path="/var/lib/kubelet/pods/a16d5b6c-3663-4a39-a43f-e3e055a61a0f/volumes" Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.393867 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.45:5353: connect: connection refused" Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.600275 4921 generic.go:334] "Generic (PLEG): container finished" podID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerID="6034c34302167f7e6fe51f3e0584d14fcd75468f0ac96ae3cc2e61ee06721561" exitCode=0 Mar 18 13:42:07 crc 
kubenswrapper[4921]: I0318 13:42:07.600330 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" event={"ID":"1964d097-3a65-4efd-bd58-16e101b36d1d","Type":"ContainerDied","Data":"6034c34302167f7e6fe51f3e0584d14fcd75468f0ac96ae3cc2e61ee06721561"} Mar 18 13:42:07 crc kubenswrapper[4921]: I0318 13:42:07.984765 4921 scope.go:117] "RemoveContainer" containerID="999a001bf1cf2790431cf4fa3c63db2079cec4287c15b298e6db0c8dcd50acf8" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.069064 4921 scope.go:117] "RemoveContainer" containerID="82a6b2d8e40bbcae7a4fbb44455e662ef41a9baacd819d4a71b64220b3702b3e" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.322859 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.405693 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-nb\") pod \"1964d097-3a65-4efd-bd58-16e101b36d1d\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.405761 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-config\") pod \"1964d097-3a65-4efd-bd58-16e101b36d1d\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.405848 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-sb\") pod \"1964d097-3a65-4efd-bd58-16e101b36d1d\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.405949 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-dns-svc\") pod \"1964d097-3a65-4efd-bd58-16e101b36d1d\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.406007 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krp6g\" (UniqueName: \"kubernetes.io/projected/1964d097-3a65-4efd-bd58-16e101b36d1d-kube-api-access-krp6g\") pod \"1964d097-3a65-4efd-bd58-16e101b36d1d\" (UID: \"1964d097-3a65-4efd-bd58-16e101b36d1d\") " Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.436414 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1964d097-3a65-4efd-bd58-16e101b36d1d-kube-api-access-krp6g" (OuterVolumeSpecName: "kube-api-access-krp6g") pod "1964d097-3a65-4efd-bd58-16e101b36d1d" (UID: "1964d097-3a65-4efd-bd58-16e101b36d1d"). InnerVolumeSpecName "kube-api-access-krp6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.486341 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1964d097-3a65-4efd-bd58-16e101b36d1d" (UID: "1964d097-3a65-4efd-bd58-16e101b36d1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.499616 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1964d097-3a65-4efd-bd58-16e101b36d1d" (UID: "1964d097-3a65-4efd-bd58-16e101b36d1d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.507928 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.507978 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krp6g\" (UniqueName: \"kubernetes.io/projected/1964d097-3a65-4efd-bd58-16e101b36d1d-kube-api-access-krp6g\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.507990 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.518855 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-config" (OuterVolumeSpecName: "config") pod "1964d097-3a65-4efd-bd58-16e101b36d1d" (UID: "1964d097-3a65-4efd-bd58-16e101b36d1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.535822 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1964d097-3a65-4efd-bd58-16e101b36d1d" (UID: "1964d097-3a65-4efd-bd58-16e101b36d1d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.609393 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.609421 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1964d097-3a65-4efd-bd58-16e101b36d1d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.613953 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" event={"ID":"1964d097-3a65-4efd-bd58-16e101b36d1d","Type":"ContainerDied","Data":"59e8d35a23902adc56021e266537b333b7e16918f48849f4be4d145f1b873955"} Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.613998 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7485969d9c-j2s62" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.614024 4921 scope.go:117] "RemoveContainer" containerID="6034c34302167f7e6fe51f3e0584d14fcd75468f0ac96ae3cc2e61ee06721561" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.660353 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-j2s62"] Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.663505 4921 scope.go:117] "RemoveContainer" containerID="a20f2c21d44e2fcd810c9a1abe5f39676c6a57c2100d3a42e8ce72e9d09468c4" Mar 18 13:42:08 crc kubenswrapper[4921]: I0318 13:42:08.670406 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7485969d9c-j2s62"] Mar 18 13:42:09 crc kubenswrapper[4921]: I0318 13:42:09.232278 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" path="/var/lib/kubelet/pods/1964d097-3a65-4efd-bd58-16e101b36d1d/volumes" 
Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.499933 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dbvpz"] Mar 18 13:42:20 crc kubenswrapper[4921]: E0318 13:42:20.500872 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="dnsmasq-dns" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.500886 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="dnsmasq-dns" Mar 18 13:42:20 crc kubenswrapper[4921]: E0318 13:42:20.500903 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="init" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.500909 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="init" Mar 18 13:42:20 crc kubenswrapper[4921]: E0318 13:42:20.500918 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eac136f-4ca2-44af-96a1-b20df93c0e01" containerName="oc" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.500924 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eac136f-4ca2-44af-96a1-b20df93c0e01" containerName="oc" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.501087 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eac136f-4ca2-44af-96a1-b20df93c0e01" containerName="oc" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.501121 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1964d097-3a65-4efd-bd58-16e101b36d1d" containerName="dnsmasq-dns" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.501865 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.514046 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dbvpz"] Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.608607 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-aa34-account-create-update-h4nxm"] Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.610005 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.612222 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.620954 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-aa34-account-create-update-h4nxm"] Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.676488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97qnb\" (UniqueName: \"kubernetes.io/projected/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-kube-api-access-97qnb\") pod \"neutron-aa34-account-create-update-h4nxm\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.676613 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9360306-7a86-4b44-8284-9e5f9df08df6-operator-scripts\") pod \"neutron-db-create-dbvpz\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.676639 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-operator-scripts\") pod \"neutron-aa34-account-create-update-h4nxm\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.676696 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whdg5\" (UniqueName: \"kubernetes.io/projected/e9360306-7a86-4b44-8284-9e5f9df08df6-kube-api-access-whdg5\") pod \"neutron-db-create-dbvpz\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.778646 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9360306-7a86-4b44-8284-9e5f9df08df6-operator-scripts\") pod \"neutron-db-create-dbvpz\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.778704 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-operator-scripts\") pod \"neutron-aa34-account-create-update-h4nxm\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.778782 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whdg5\" (UniqueName: \"kubernetes.io/projected/e9360306-7a86-4b44-8284-9e5f9df08df6-kube-api-access-whdg5\") pod \"neutron-db-create-dbvpz\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.778887 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-97qnb\" (UniqueName: \"kubernetes.io/projected/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-kube-api-access-97qnb\") pod \"neutron-aa34-account-create-update-h4nxm\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.779576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9360306-7a86-4b44-8284-9e5f9df08df6-operator-scripts\") pod \"neutron-db-create-dbvpz\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.779780 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-operator-scripts\") pod \"neutron-aa34-account-create-update-h4nxm\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.807329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97qnb\" (UniqueName: \"kubernetes.io/projected/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-kube-api-access-97qnb\") pod \"neutron-aa34-account-create-update-h4nxm\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.809738 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whdg5\" (UniqueName: \"kubernetes.io/projected/e9360306-7a86-4b44-8284-9e5f9df08df6-kube-api-access-whdg5\") pod \"neutron-db-create-dbvpz\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.821549 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:20 crc kubenswrapper[4921]: I0318 13:42:20.937659 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.298563 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dbvpz"] Mar 18 13:42:21 crc kubenswrapper[4921]: W0318 13:42:21.460221 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec6a3cc_7b39_40f4_9977_8d0f12e44ea9.slice/crio-d4cb8598fcc54f106c362d1dd7acc70546e66f0e422af97b4d6211e6feffb36f WatchSource:0}: Error finding container d4cb8598fcc54f106c362d1dd7acc70546e66f0e422af97b4d6211e6feffb36f: Status 404 returned error can't find the container with id d4cb8598fcc54f106c362d1dd7acc70546e66f0e422af97b4d6211e6feffb36f Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.460274 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-aa34-account-create-update-h4nxm"] Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.732846 4921 generic.go:334] "Generic (PLEG): container finished" podID="e9360306-7a86-4b44-8284-9e5f9df08df6" containerID="e5f87ed0a043e3c96c25cca2f00f679dde038e65bbd9bbb43a4ceacf6f3ae8df" exitCode=0 Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.732936 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbvpz" event={"ID":"e9360306-7a86-4b44-8284-9e5f9df08df6","Type":"ContainerDied","Data":"e5f87ed0a043e3c96c25cca2f00f679dde038e65bbd9bbb43a4ceacf6f3ae8df"} Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.732974 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbvpz" 
event={"ID":"e9360306-7a86-4b44-8284-9e5f9df08df6","Type":"ContainerStarted","Data":"c318abd7e80ea0c8294f72ac02b4950a38084f49c6135f49bd6dd05e1e5716d2"} Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.735371 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aa34-account-create-update-h4nxm" event={"ID":"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9","Type":"ContainerStarted","Data":"44879c9cc26ff2973864295bab118d591f421619c856cd9e6cc1f7278e4a8bc4"} Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.735397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aa34-account-create-update-h4nxm" event={"ID":"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9","Type":"ContainerStarted","Data":"d4cb8598fcc54f106c362d1dd7acc70546e66f0e422af97b4d6211e6feffb36f"} Mar 18 13:42:21 crc kubenswrapper[4921]: I0318 13:42:21.762220 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-aa34-account-create-update-h4nxm" podStartSLOduration=1.762193661 podStartE2EDuration="1.762193661s" podCreationTimestamp="2026-03-18 13:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:42:21.761073989 +0000 UTC m=+5561.310994638" watchObservedRunningTime="2026-03-18 13:42:21.762193661 +0000 UTC m=+5561.312114300" Mar 18 13:42:22 crc kubenswrapper[4921]: I0318 13:42:22.756154 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" containerID="44879c9cc26ff2973864295bab118d591f421619c856cd9e6cc1f7278e4a8bc4" exitCode=0 Mar 18 13:42:22 crc kubenswrapper[4921]: I0318 13:42:22.756827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aa34-account-create-update-h4nxm" event={"ID":"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9","Type":"ContainerDied","Data":"44879c9cc26ff2973864295bab118d591f421619c856cd9e6cc1f7278e4a8bc4"} Mar 18 13:42:23 crc 
kubenswrapper[4921]: I0318 13:42:23.129247 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.230009 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whdg5\" (UniqueName: \"kubernetes.io/projected/e9360306-7a86-4b44-8284-9e5f9df08df6-kube-api-access-whdg5\") pod \"e9360306-7a86-4b44-8284-9e5f9df08df6\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.230196 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9360306-7a86-4b44-8284-9e5f9df08df6-operator-scripts\") pod \"e9360306-7a86-4b44-8284-9e5f9df08df6\" (UID: \"e9360306-7a86-4b44-8284-9e5f9df08df6\") " Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.231336 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9360306-7a86-4b44-8284-9e5f9df08df6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9360306-7a86-4b44-8284-9e5f9df08df6" (UID: "e9360306-7a86-4b44-8284-9e5f9df08df6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.237323 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9360306-7a86-4b44-8284-9e5f9df08df6-kube-api-access-whdg5" (OuterVolumeSpecName: "kube-api-access-whdg5") pod "e9360306-7a86-4b44-8284-9e5f9df08df6" (UID: "e9360306-7a86-4b44-8284-9e5f9df08df6"). InnerVolumeSpecName "kube-api-access-whdg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.332855 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whdg5\" (UniqueName: \"kubernetes.io/projected/e9360306-7a86-4b44-8284-9e5f9df08df6-kube-api-access-whdg5\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.332916 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9360306-7a86-4b44-8284-9e5f9df08df6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.767320 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dbvpz" Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.767333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dbvpz" event={"ID":"e9360306-7a86-4b44-8284-9e5f9df08df6","Type":"ContainerDied","Data":"c318abd7e80ea0c8294f72ac02b4950a38084f49c6135f49bd6dd05e1e5716d2"} Mar 18 13:42:23 crc kubenswrapper[4921]: I0318 13:42:23.771759 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c318abd7e80ea0c8294f72ac02b4950a38084f49c6135f49bd6dd05e1e5716d2" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.129604 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.248620 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-operator-scripts\") pod \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.249068 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97qnb\" (UniqueName: \"kubernetes.io/projected/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-kube-api-access-97qnb\") pod \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\" (UID: \"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9\") " Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.249917 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" (UID: "1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.254170 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-kube-api-access-97qnb" (OuterVolumeSpecName: "kube-api-access-97qnb") pod "1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" (UID: "1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9"). InnerVolumeSpecName "kube-api-access-97qnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.352370 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.352414 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97qnb\" (UniqueName: \"kubernetes.io/projected/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9-kube-api-access-97qnb\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.777797 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aa34-account-create-update-h4nxm" event={"ID":"1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9","Type":"ContainerDied","Data":"d4cb8598fcc54f106c362d1dd7acc70546e66f0e422af97b4d6211e6feffb36f"} Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.778754 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4cb8598fcc54f106c362d1dd7acc70546e66f0e422af97b4d6211e6feffb36f" Mar 18 13:42:24 crc kubenswrapper[4921]: I0318 13:42:24.777834 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-aa34-account-create-update-h4nxm" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.816784 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rc4qq"] Mar 18 13:42:25 crc kubenswrapper[4921]: E0318 13:42:25.817637 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" containerName="mariadb-account-create-update" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.817659 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" containerName="mariadb-account-create-update" Mar 18 13:42:25 crc kubenswrapper[4921]: E0318 13:42:25.817675 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9360306-7a86-4b44-8284-9e5f9df08df6" containerName="mariadb-database-create" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.817682 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9360306-7a86-4b44-8284-9e5f9df08df6" containerName="mariadb-database-create" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.817948 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" containerName="mariadb-account-create-update" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.817967 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9360306-7a86-4b44-8284-9e5f9df08df6" containerName="mariadb-database-create" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.819293 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.821846 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.822270 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ctm4k" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.822375 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.840960 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rc4qq"] Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.987541 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-combined-ca-bundle\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.987840 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-config\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:25 crc kubenswrapper[4921]: I0318 13:42:25.987947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27mr\" (UniqueName: \"kubernetes.io/projected/79fc5201-6049-4c9c-8e89-b6282f01a708-kube-api-access-t27mr\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.090278 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-combined-ca-bundle\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.090348 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-config\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.090381 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27mr\" (UniqueName: \"kubernetes.io/projected/79fc5201-6049-4c9c-8e89-b6282f01a708-kube-api-access-t27mr\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.098582 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-config\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.098661 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-combined-ca-bundle\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.115813 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27mr\" (UniqueName: 
\"kubernetes.io/projected/79fc5201-6049-4c9c-8e89-b6282f01a708-kube-api-access-t27mr\") pod \"neutron-db-sync-rc4qq\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.184307 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.648083 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rc4qq"] Mar 18 13:42:26 crc kubenswrapper[4921]: I0318 13:42:26.795234 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rc4qq" event={"ID":"79fc5201-6049-4c9c-8e89-b6282f01a708","Type":"ContainerStarted","Data":"90785a764741e8ce7f1456b8cf012e867499707566bbbaf00abddf4dfbe579fc"} Mar 18 13:42:27 crc kubenswrapper[4921]: I0318 13:42:27.805570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rc4qq" event={"ID":"79fc5201-6049-4c9c-8e89-b6282f01a708","Type":"ContainerStarted","Data":"941b4171b6f2f6047e5707ce8db88ed30f91dcf9b095ba38f6f31015fe597182"} Mar 18 13:42:27 crc kubenswrapper[4921]: I0318 13:42:27.829750 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rc4qq" podStartSLOduration=2.8297253209999997 podStartE2EDuration="2.829725321s" podCreationTimestamp="2026-03-18 13:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:42:27.821080214 +0000 UTC m=+5567.371000853" watchObservedRunningTime="2026-03-18 13:42:27.829725321 +0000 UTC m=+5567.379645960" Mar 18 13:42:32 crc kubenswrapper[4921]: I0318 13:42:32.852251 4921 generic.go:334] "Generic (PLEG): container finished" podID="79fc5201-6049-4c9c-8e89-b6282f01a708" containerID="941b4171b6f2f6047e5707ce8db88ed30f91dcf9b095ba38f6f31015fe597182" exitCode=0 Mar 18 13:42:32 
crc kubenswrapper[4921]: I0318 13:42:32.852644 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rc4qq" event={"ID":"79fc5201-6049-4c9c-8e89-b6282f01a708","Type":"ContainerDied","Data":"941b4171b6f2f6047e5707ce8db88ed30f91dcf9b095ba38f6f31015fe597182"} Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.245296 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.364023 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27mr\" (UniqueName: \"kubernetes.io/projected/79fc5201-6049-4c9c-8e89-b6282f01a708-kube-api-access-t27mr\") pod \"79fc5201-6049-4c9c-8e89-b6282f01a708\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.364149 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-config\") pod \"79fc5201-6049-4c9c-8e89-b6282f01a708\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.364228 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-combined-ca-bundle\") pod \"79fc5201-6049-4c9c-8e89-b6282f01a708\" (UID: \"79fc5201-6049-4c9c-8e89-b6282f01a708\") " Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.377503 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fc5201-6049-4c9c-8e89-b6282f01a708-kube-api-access-t27mr" (OuterVolumeSpecName: "kube-api-access-t27mr") pod "79fc5201-6049-4c9c-8e89-b6282f01a708" (UID: "79fc5201-6049-4c9c-8e89-b6282f01a708"). InnerVolumeSpecName "kube-api-access-t27mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.391441 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-config" (OuterVolumeSpecName: "config") pod "79fc5201-6049-4c9c-8e89-b6282f01a708" (UID: "79fc5201-6049-4c9c-8e89-b6282f01a708"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.393209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79fc5201-6049-4c9c-8e89-b6282f01a708" (UID: "79fc5201-6049-4c9c-8e89-b6282f01a708"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.467204 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.467251 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27mr\" (UniqueName: \"kubernetes.io/projected/79fc5201-6049-4c9c-8e89-b6282f01a708-kube-api-access-t27mr\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.467262 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/79fc5201-6049-4c9c-8e89-b6282f01a708-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.880087 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rc4qq" 
event={"ID":"79fc5201-6049-4c9c-8e89-b6282f01a708","Type":"ContainerDied","Data":"90785a764741e8ce7f1456b8cf012e867499707566bbbaf00abddf4dfbe579fc"} Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.880617 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90785a764741e8ce7f1456b8cf012e867499707566bbbaf00abddf4dfbe579fc" Mar 18 13:42:34 crc kubenswrapper[4921]: I0318 13:42:34.880185 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rc4qq" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.035335 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-m2j24"] Mar 18 13:42:35 crc kubenswrapper[4921]: E0318 13:42:35.035840 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fc5201-6049-4c9c-8e89-b6282f01a708" containerName="neutron-db-sync" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.035861 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fc5201-6049-4c9c-8e89-b6282f01a708" containerName="neutron-db-sync" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.036072 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fc5201-6049-4c9c-8e89-b6282f01a708" containerName="neutron-db-sync" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.037301 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.061527 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-m2j24"] Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.139568 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5cc84b8dcc-z4hnz"] Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.141019 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.144632 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.145098 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ctm4k" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.145223 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.172028 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cc84b8dcc-z4hnz"] Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.184241 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmsqd\" (UniqueName: \"kubernetes.io/projected/930f175c-5b60-4f69-9e66-0be30d7d987b-kube-api-access-fmsqd\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.184314 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-nb\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.184508 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-config\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 
13:42:35.184631 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-dns-svc\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.184658 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-sb\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-config\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286489 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk65b\" (UniqueName: \"kubernetes.io/projected/e064cef7-7755-4e07-87e3-52b941cd9ead-kube-api-access-hk65b\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286538 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-dns-svc\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286558 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-sb\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286616 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-combined-ca-bundle\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-httpd-config\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmsqd\" (UniqueName: \"kubernetes.io/projected/930f175c-5b60-4f69-9e66-0be30d7d987b-kube-api-access-fmsqd\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286704 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-nb\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.286737 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-config\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.287717 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-config\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.288788 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-dns-svc\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.290420 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-sb\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.291212 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-nb\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.311945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmsqd\" (UniqueName: 
\"kubernetes.io/projected/930f175c-5b60-4f69-9e66-0be30d7d987b-kube-api-access-fmsqd\") pod \"dnsmasq-dns-94d77d5bf-m2j24\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.361306 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.389558 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-combined-ca-bundle\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.389638 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-httpd-config\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.389699 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-config\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.389777 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk65b\" (UniqueName: \"kubernetes.io/projected/e064cef7-7755-4e07-87e3-52b941cd9ead-kube-api-access-hk65b\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.397395 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-config\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.398215 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-httpd-config\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.409460 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk65b\" (UniqueName: \"kubernetes.io/projected/e064cef7-7755-4e07-87e3-52b941cd9ead-kube-api-access-hk65b\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.413329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e064cef7-7755-4e07-87e3-52b941cd9ead-combined-ca-bundle\") pod \"neutron-5cc84b8dcc-z4hnz\" (UID: \"e064cef7-7755-4e07-87e3-52b941cd9ead\") " pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.462361 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:35 crc kubenswrapper[4921]: I0318 13:42:35.894096 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-m2j24"] Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.177432 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cc84b8dcc-z4hnz"] Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.901591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cc84b8dcc-z4hnz" event={"ID":"e064cef7-7755-4e07-87e3-52b941cd9ead","Type":"ContainerStarted","Data":"8936900e8285bfc59e7c6a1124b3d2c1292533e2e8e64678c8e8b511436aaf98"} Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.901910 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cc84b8dcc-z4hnz" event={"ID":"e064cef7-7755-4e07-87e3-52b941cd9ead","Type":"ContainerStarted","Data":"72cd534e1b1821d12c9f6778ee32e5ca80ca7f3604f944fbb703eae5e64450f1"} Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.901925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cc84b8dcc-z4hnz" event={"ID":"e064cef7-7755-4e07-87e3-52b941cd9ead","Type":"ContainerStarted","Data":"a8cb7a52a41a6c828432139a50d536024def1a8d779877339a92aa5d5d80705d"} Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.902451 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.905013 4921 generic.go:334] "Generic (PLEG): container finished" podID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerID="3fcb61daf21eb6206c886f672522ba3348a06e34706b9cde6a4c37373cbfb407" exitCode=0 Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.905066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" 
event={"ID":"930f175c-5b60-4f69-9e66-0be30d7d987b","Type":"ContainerDied","Data":"3fcb61daf21eb6206c886f672522ba3348a06e34706b9cde6a4c37373cbfb407"} Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.905097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" event={"ID":"930f175c-5b60-4f69-9e66-0be30d7d987b","Type":"ContainerStarted","Data":"bca2a6bcb01494334fd585a06ff87533f53f8ef9a00c8e1c4877d6d84fb68373"} Mar 18 13:42:36 crc kubenswrapper[4921]: I0318 13:42:36.943316 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5cc84b8dcc-z4hnz" podStartSLOduration=1.9432905379999998 podStartE2EDuration="1.943290538s" podCreationTimestamp="2026-03-18 13:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:42:36.918194042 +0000 UTC m=+5576.468114701" watchObservedRunningTime="2026-03-18 13:42:36.943290538 +0000 UTC m=+5576.493211187" Mar 18 13:42:37 crc kubenswrapper[4921]: I0318 13:42:37.919628 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" event={"ID":"930f175c-5b60-4f69-9e66-0be30d7d987b","Type":"ContainerStarted","Data":"4f418bb058c35edd6a9445ffc565522883b1da78889e7df91473691168d0f8e2"} Mar 18 13:42:37 crc kubenswrapper[4921]: I0318 13:42:37.919864 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:37 crc kubenswrapper[4921]: I0318 13:42:37.952086 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" podStartSLOduration=2.952059407 podStartE2EDuration="2.952059407s" podCreationTimestamp="2026-03-18 13:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:42:37.940892869 +0000 UTC 
m=+5577.490813508" watchObservedRunningTime="2026-03-18 13:42:37.952059407 +0000 UTC m=+5577.501980056" Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.363201 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.444799 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-2ltxd"] Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.445606 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" containerName="dnsmasq-dns" containerID="cri-o://0ab39b001b0e58bd1a2c39824bc72c555b0fa8a0bacb3c420aad6bbfbcbc64f7" gracePeriod=10 Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.992897 4921 generic.go:334] "Generic (PLEG): container finished" podID="42754439-0377-4dfa-b2f6-32843de55d0a" containerID="0ab39b001b0e58bd1a2c39824bc72c555b0fa8a0bacb3c420aad6bbfbcbc64f7" exitCode=0 Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.993221 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" event={"ID":"42754439-0377-4dfa-b2f6-32843de55d0a","Type":"ContainerDied","Data":"0ab39b001b0e58bd1a2c39824bc72c555b0fa8a0bacb3c420aad6bbfbcbc64f7"} Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.993297 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" event={"ID":"42754439-0377-4dfa-b2f6-32843de55d0a","Type":"ContainerDied","Data":"6993fe96de7fe2f3951156c8e11d822028ae419dd11d71ce070bf8f2d2405502"} Mar 18 13:42:45 crc kubenswrapper[4921]: I0318 13:42:45.993320 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6993fe96de7fe2f3951156c8e11d822028ae419dd11d71ce070bf8f2d2405502" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.004751 4921 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.148897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-sb\") pod \"42754439-0377-4dfa-b2f6-32843de55d0a\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.149015 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-dns-svc\") pod \"42754439-0377-4dfa-b2f6-32843de55d0a\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.149138 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4kdt\" (UniqueName: \"kubernetes.io/projected/42754439-0377-4dfa-b2f6-32843de55d0a-kube-api-access-j4kdt\") pod \"42754439-0377-4dfa-b2f6-32843de55d0a\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.149233 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-nb\") pod \"42754439-0377-4dfa-b2f6-32843de55d0a\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.149324 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-config\") pod \"42754439-0377-4dfa-b2f6-32843de55d0a\" (UID: \"42754439-0377-4dfa-b2f6-32843de55d0a\") " Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.155803 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/42754439-0377-4dfa-b2f6-32843de55d0a-kube-api-access-j4kdt" (OuterVolumeSpecName: "kube-api-access-j4kdt") pod "42754439-0377-4dfa-b2f6-32843de55d0a" (UID: "42754439-0377-4dfa-b2f6-32843de55d0a"). InnerVolumeSpecName "kube-api-access-j4kdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.196276 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42754439-0377-4dfa-b2f6-32843de55d0a" (UID: "42754439-0377-4dfa-b2f6-32843de55d0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.198067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42754439-0377-4dfa-b2f6-32843de55d0a" (UID: "42754439-0377-4dfa-b2f6-32843de55d0a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.199222 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-config" (OuterVolumeSpecName: "config") pod "42754439-0377-4dfa-b2f6-32843de55d0a" (UID: "42754439-0377-4dfa-b2f6-32843de55d0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.200576 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42754439-0377-4dfa-b2f6-32843de55d0a" (UID: "42754439-0377-4dfa-b2f6-32843de55d0a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.251953 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.252001 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.252016 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.252029 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4kdt\" (UniqueName: \"kubernetes.io/projected/42754439-0377-4dfa-b2f6-32843de55d0a-kube-api-access-j4kdt\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:46 crc kubenswrapper[4921]: I0318 13:42:46.252046 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42754439-0377-4dfa-b2f6-32843de55d0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:47 crc kubenswrapper[4921]: I0318 13:42:47.001364 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869545f9c9-2ltxd" Mar 18 13:42:47 crc kubenswrapper[4921]: I0318 13:42:47.051652 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-2ltxd"] Mar 18 13:42:47 crc kubenswrapper[4921]: I0318 13:42:47.061496 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869545f9c9-2ltxd"] Mar 18 13:42:47 crc kubenswrapper[4921]: I0318 13:42:47.222504 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" path="/var/lib/kubelet/pods/42754439-0377-4dfa-b2f6-32843de55d0a/volumes" Mar 18 13:43:05 crc kubenswrapper[4921]: I0318 13:43:05.478390 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5cc84b8dcc-z4hnz" Mar 18 13:43:12 crc kubenswrapper[4921]: I0318 13:43:12.959480 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-49xjt"] Mar 18 13:43:12 crc kubenswrapper[4921]: E0318 13:43:12.960620 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" containerName="dnsmasq-dns" Mar 18 13:43:12 crc kubenswrapper[4921]: I0318 13:43:12.960637 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" containerName="dnsmasq-dns" Mar 18 13:43:12 crc kubenswrapper[4921]: E0318 13:43:12.960669 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" containerName="init" Mar 18 13:43:12 crc kubenswrapper[4921]: I0318 13:43:12.960676 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" containerName="init" Mar 18 13:43:12 crc kubenswrapper[4921]: I0318 13:43:12.960859 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="42754439-0377-4dfa-b2f6-32843de55d0a" containerName="dnsmasq-dns" Mar 18 13:43:12 crc kubenswrapper[4921]: I0318 
13:43:12.961611 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-49xjt" Mar 18 13:43:12 crc kubenswrapper[4921]: I0318 13:43:12.975100 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-49xjt"] Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.053305 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-071c-account-create-update-286tt"] Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.055057 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.058054 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.060130 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzh97\" (UniqueName: \"kubernetes.io/projected/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-kube-api-access-bzh97\") pod \"glance-db-create-49xjt\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.060294 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-operator-scripts\") pod \"glance-db-create-49xjt\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.064309 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-071c-account-create-update-286tt"] Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.161657 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvcbs\" 
(UniqueName: \"kubernetes.io/projected/4971dbff-763d-416b-8582-9eda46f0baeb-kube-api-access-pvcbs\") pod \"glance-071c-account-create-update-286tt\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.161721 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4971dbff-763d-416b-8582-9eda46f0baeb-operator-scripts\") pod \"glance-071c-account-create-update-286tt\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.161792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-operator-scripts\") pod \"glance-db-create-49xjt\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.161875 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzh97\" (UniqueName: \"kubernetes.io/projected/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-kube-api-access-bzh97\") pod \"glance-db-create-49xjt\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.162704 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-operator-scripts\") pod \"glance-db-create-49xjt\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.181235 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzh97\" 
(UniqueName: \"kubernetes.io/projected/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-kube-api-access-bzh97\") pod \"glance-db-create-49xjt\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.263160 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvcbs\" (UniqueName: \"kubernetes.io/projected/4971dbff-763d-416b-8582-9eda46f0baeb-kube-api-access-pvcbs\") pod \"glance-071c-account-create-update-286tt\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.263467 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4971dbff-763d-416b-8582-9eda46f0baeb-operator-scripts\") pod \"glance-071c-account-create-update-286tt\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.264355 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4971dbff-763d-416b-8582-9eda46f0baeb-operator-scripts\") pod \"glance-071c-account-create-update-286tt\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.282967 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvcbs\" (UniqueName: \"kubernetes.io/projected/4971dbff-763d-416b-8582-9eda46f0baeb-kube-api-access-pvcbs\") pod \"glance-071c-account-create-update-286tt\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.286933 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-49xjt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.371817 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.852388 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-071c-account-create-update-286tt"] Mar 18 13:43:13 crc kubenswrapper[4921]: I0318 13:43:13.902558 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-49xjt"] Mar 18 13:43:13 crc kubenswrapper[4921]: W0318 13:43:13.903747 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9625f730_5f70_4a22_a0ed_cffc4bbd0ced.slice/crio-750dd872b8ba7f39b0717410e40a725869a87214c81ae9db703d19b771245405 WatchSource:0}: Error finding container 750dd872b8ba7f39b0717410e40a725869a87214c81ae9db703d19b771245405: Status 404 returned error can't find the container with id 750dd872b8ba7f39b0717410e40a725869a87214c81ae9db703d19b771245405 Mar 18 13:43:14 crc kubenswrapper[4921]: I0318 13:43:14.253733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-071c-account-create-update-286tt" event={"ID":"4971dbff-763d-416b-8582-9eda46f0baeb","Type":"ContainerStarted","Data":"a0530887b7045ec4642bb8ed83eee2562c06233d0bd83e2d7ff375f137eee036"} Mar 18 13:43:14 crc kubenswrapper[4921]: I0318 13:43:14.253780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-071c-account-create-update-286tt" event={"ID":"4971dbff-763d-416b-8582-9eda46f0baeb","Type":"ContainerStarted","Data":"a4aa60ee9268bedb7977b0d93ef40305d061021eef1574668ed015aba670c336"} Mar 18 13:43:14 crc kubenswrapper[4921]: I0318 13:43:14.255280 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-49xjt" 
event={"ID":"9625f730-5f70-4a22-a0ed-cffc4bbd0ced","Type":"ContainerStarted","Data":"0718ca0d26de2c05a90e0117cabaebba552eb1e052bd89eb788ca16e756cf116"} Mar 18 13:43:14 crc kubenswrapper[4921]: I0318 13:43:14.255319 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-49xjt" event={"ID":"9625f730-5f70-4a22-a0ed-cffc4bbd0ced","Type":"ContainerStarted","Data":"750dd872b8ba7f39b0717410e40a725869a87214c81ae9db703d19b771245405"} Mar 18 13:43:14 crc kubenswrapper[4921]: I0318 13:43:14.269220 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-071c-account-create-update-286tt" podStartSLOduration=1.269188503 podStartE2EDuration="1.269188503s" podCreationTimestamp="2026-03-18 13:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:14.268559615 +0000 UTC m=+5613.818480264" watchObservedRunningTime="2026-03-18 13:43:14.269188503 +0000 UTC m=+5613.819109142" Mar 18 13:43:14 crc kubenswrapper[4921]: I0318 13:43:14.284863 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-49xjt" podStartSLOduration=2.28484027 podStartE2EDuration="2.28484027s" podCreationTimestamp="2026-03-18 13:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:14.283061899 +0000 UTC m=+5613.832982558" watchObservedRunningTime="2026-03-18 13:43:14.28484027 +0000 UTC m=+5613.834760909" Mar 18 13:43:15 crc kubenswrapper[4921]: I0318 13:43:15.266882 4921 generic.go:334] "Generic (PLEG): container finished" podID="9625f730-5f70-4a22-a0ed-cffc4bbd0ced" containerID="0718ca0d26de2c05a90e0117cabaebba552eb1e052bd89eb788ca16e756cf116" exitCode=0 Mar 18 13:43:15 crc kubenswrapper[4921]: I0318 13:43:15.266990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-49xjt" event={"ID":"9625f730-5f70-4a22-a0ed-cffc4bbd0ced","Type":"ContainerDied","Data":"0718ca0d26de2c05a90e0117cabaebba552eb1e052bd89eb788ca16e756cf116"} Mar 18 13:43:15 crc kubenswrapper[4921]: I0318 13:43:15.271808 4921 generic.go:334] "Generic (PLEG): container finished" podID="4971dbff-763d-416b-8582-9eda46f0baeb" containerID="a0530887b7045ec4642bb8ed83eee2562c06233d0bd83e2d7ff375f137eee036" exitCode=0 Mar 18 13:43:15 crc kubenswrapper[4921]: I0318 13:43:15.271864 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-071c-account-create-update-286tt" event={"ID":"4971dbff-763d-416b-8582-9eda46f0baeb","Type":"ContainerDied","Data":"a0530887b7045ec4642bb8ed83eee2562c06233d0bd83e2d7ff375f137eee036"} Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.694456 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-49xjt" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.701352 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.743864 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-operator-scripts\") pod \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.743989 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvcbs\" (UniqueName: \"kubernetes.io/projected/4971dbff-763d-416b-8582-9eda46f0baeb-kube-api-access-pvcbs\") pod \"4971dbff-763d-416b-8582-9eda46f0baeb\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.744059 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzh97\" (UniqueName: \"kubernetes.io/projected/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-kube-api-access-bzh97\") pod \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\" (UID: \"9625f730-5f70-4a22-a0ed-cffc4bbd0ced\") " Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.744177 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4971dbff-763d-416b-8582-9eda46f0baeb-operator-scripts\") pod \"4971dbff-763d-416b-8582-9eda46f0baeb\" (UID: \"4971dbff-763d-416b-8582-9eda46f0baeb\") " Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.746489 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9625f730-5f70-4a22-a0ed-cffc4bbd0ced" (UID: "9625f730-5f70-4a22-a0ed-cffc4bbd0ced"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.746554 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4971dbff-763d-416b-8582-9eda46f0baeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4971dbff-763d-416b-8582-9eda46f0baeb" (UID: "4971dbff-763d-416b-8582-9eda46f0baeb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.752020 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-kube-api-access-bzh97" (OuterVolumeSpecName: "kube-api-access-bzh97") pod "9625f730-5f70-4a22-a0ed-cffc4bbd0ced" (UID: "9625f730-5f70-4a22-a0ed-cffc4bbd0ced"). InnerVolumeSpecName "kube-api-access-bzh97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.761921 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4971dbff-763d-416b-8582-9eda46f0baeb-kube-api-access-pvcbs" (OuterVolumeSpecName: "kube-api-access-pvcbs") pod "4971dbff-763d-416b-8582-9eda46f0baeb" (UID: "4971dbff-763d-416b-8582-9eda46f0baeb"). InnerVolumeSpecName "kube-api-access-pvcbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.846266 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4971dbff-763d-416b-8582-9eda46f0baeb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.846308 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.846322 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvcbs\" (UniqueName: \"kubernetes.io/projected/4971dbff-763d-416b-8582-9eda46f0baeb-kube-api-access-pvcbs\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:16 crc kubenswrapper[4921]: I0318 13:43:16.846336 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzh97\" (UniqueName: \"kubernetes.io/projected/9625f730-5f70-4a22-a0ed-cffc4bbd0ced-kube-api-access-bzh97\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.080861 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.080937 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.301893 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-49xjt" event={"ID":"9625f730-5f70-4a22-a0ed-cffc4bbd0ced","Type":"ContainerDied","Data":"750dd872b8ba7f39b0717410e40a725869a87214c81ae9db703d19b771245405"} Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.301941 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750dd872b8ba7f39b0717410e40a725869a87214c81ae9db703d19b771245405" Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.301957 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-49xjt" Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.307373 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-071c-account-create-update-286tt" event={"ID":"4971dbff-763d-416b-8582-9eda46f0baeb","Type":"ContainerDied","Data":"a4aa60ee9268bedb7977b0d93ef40305d061021eef1574668ed015aba670c336"} Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.307418 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4aa60ee9268bedb7977b0d93ef40305d061021eef1574668ed015aba670c336" Mar 18 13:43:17 crc kubenswrapper[4921]: I0318 13:43:17.307497 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-071c-account-create-update-286tt" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.305682 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-f92d5"] Mar 18 13:43:18 crc kubenswrapper[4921]: E0318 13:43:18.308869 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9625f730-5f70-4a22-a0ed-cffc4bbd0ced" containerName="mariadb-database-create" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.308893 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9625f730-5f70-4a22-a0ed-cffc4bbd0ced" containerName="mariadb-database-create" Mar 18 13:43:18 crc kubenswrapper[4921]: E0318 13:43:18.308957 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4971dbff-763d-416b-8582-9eda46f0baeb" containerName="mariadb-account-create-update" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.308972 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4971dbff-763d-416b-8582-9eda46f0baeb" containerName="mariadb-account-create-update" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.309322 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9625f730-5f70-4a22-a0ed-cffc4bbd0ced" containerName="mariadb-database-create" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.309338 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4971dbff-763d-416b-8582-9eda46f0baeb" containerName="mariadb-account-create-update" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.310458 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.313402 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g2xns" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.314576 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.330063 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-f92d5"] Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.376618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-combined-ca-bundle\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.376687 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86dhq\" (UniqueName: \"kubernetes.io/projected/919236cf-190f-434e-ad68-aea95427a765-kube-api-access-86dhq\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.376747 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-config-data\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.376775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-db-sync-config-data\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.479608 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-combined-ca-bundle\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.479691 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86dhq\" (UniqueName: \"kubernetes.io/projected/919236cf-190f-434e-ad68-aea95427a765-kube-api-access-86dhq\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.479750 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-config-data\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.479784 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-db-sync-config-data\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.487567 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-config-data\") pod \"glance-db-sync-f92d5\" (UID: 
\"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.489451 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-combined-ca-bundle\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.489595 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-db-sync-config-data\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.504185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86dhq\" (UniqueName: \"kubernetes.io/projected/919236cf-190f-434e-ad68-aea95427a765-kube-api-access-86dhq\") pod \"glance-db-sync-f92d5\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:18 crc kubenswrapper[4921]: I0318 13:43:18.627695 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:19 crc kubenswrapper[4921]: I0318 13:43:19.219963 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-f92d5"] Mar 18 13:43:19 crc kubenswrapper[4921]: I0318 13:43:19.329828 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f92d5" event={"ID":"919236cf-190f-434e-ad68-aea95427a765","Type":"ContainerStarted","Data":"31947d2faa548dc6af0f127f9c4e159c070813cc0f4101be365aa723b80ac2b1"} Mar 18 13:43:20 crc kubenswrapper[4921]: I0318 13:43:20.340231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f92d5" event={"ID":"919236cf-190f-434e-ad68-aea95427a765","Type":"ContainerStarted","Data":"02aa7b07daca54c57e9e50efeb5e7759d15368ba06efab11df2076e0875971d5"} Mar 18 13:43:20 crc kubenswrapper[4921]: I0318 13:43:20.359189 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-f92d5" podStartSLOduration=2.359169223 podStartE2EDuration="2.359169223s" podCreationTimestamp="2026-03-18 13:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:20.355188569 +0000 UTC m=+5619.905109228" watchObservedRunningTime="2026-03-18 13:43:20.359169223 +0000 UTC m=+5619.909089862" Mar 18 13:43:23 crc kubenswrapper[4921]: I0318 13:43:23.368853 4921 generic.go:334] "Generic (PLEG): container finished" podID="919236cf-190f-434e-ad68-aea95427a765" containerID="02aa7b07daca54c57e9e50efeb5e7759d15368ba06efab11df2076e0875971d5" exitCode=0 Mar 18 13:43:23 crc kubenswrapper[4921]: I0318 13:43:23.368940 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f92d5" event={"ID":"919236cf-190f-434e-ad68-aea95427a765","Type":"ContainerDied","Data":"02aa7b07daca54c57e9e50efeb5e7759d15368ba06efab11df2076e0875971d5"} Mar 18 13:43:24 crc kubenswrapper[4921]: 
I0318 13:43:24.808865 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.908342 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86dhq\" (UniqueName: \"kubernetes.io/projected/919236cf-190f-434e-ad68-aea95427a765-kube-api-access-86dhq\") pod \"919236cf-190f-434e-ad68-aea95427a765\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.908482 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-combined-ca-bundle\") pod \"919236cf-190f-434e-ad68-aea95427a765\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.908664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-config-data\") pod \"919236cf-190f-434e-ad68-aea95427a765\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.908790 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-db-sync-config-data\") pod \"919236cf-190f-434e-ad68-aea95427a765\" (UID: \"919236cf-190f-434e-ad68-aea95427a765\") " Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.914300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "919236cf-190f-434e-ad68-aea95427a765" (UID: "919236cf-190f-434e-ad68-aea95427a765"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.932596 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919236cf-190f-434e-ad68-aea95427a765-kube-api-access-86dhq" (OuterVolumeSpecName: "kube-api-access-86dhq") pod "919236cf-190f-434e-ad68-aea95427a765" (UID: "919236cf-190f-434e-ad68-aea95427a765"). InnerVolumeSpecName "kube-api-access-86dhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.943435 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "919236cf-190f-434e-ad68-aea95427a765" (UID: "919236cf-190f-434e-ad68-aea95427a765"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:24 crc kubenswrapper[4921]: I0318 13:43:24.967937 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-config-data" (OuterVolumeSpecName: "config-data") pod "919236cf-190f-434e-ad68-aea95427a765" (UID: "919236cf-190f-434e-ad68-aea95427a765"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.011067 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86dhq\" (UniqueName: \"kubernetes.io/projected/919236cf-190f-434e-ad68-aea95427a765-kube-api-access-86dhq\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.011119 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.011132 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.011146 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/919236cf-190f-434e-ad68-aea95427a765-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.389508 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-f92d5" event={"ID":"919236cf-190f-434e-ad68-aea95427a765","Type":"ContainerDied","Data":"31947d2faa548dc6af0f127f9c4e159c070813cc0f4101be365aa723b80ac2b1"} Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.389858 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31947d2faa548dc6af0f127f9c4e159c070813cc0f4101be365aa723b80ac2b1" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.389585 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-f92d5" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.670552 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:25 crc kubenswrapper[4921]: E0318 13:43:25.671053 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919236cf-190f-434e-ad68-aea95427a765" containerName="glance-db-sync" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.671076 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="919236cf-190f-434e-ad68-aea95427a765" containerName="glance-db-sync" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.671373 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="919236cf-190f-434e-ad68-aea95427a765" containerName="glance-db-sync" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.676557 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.679192 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g2xns" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.679281 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.680827 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.682627 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.686311 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.737958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.738032 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w5pl\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-kube-api-access-6w5pl\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.738057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-config-data\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.738105 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.738154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-ceph\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.738191 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-scripts\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.738210 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-logs\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.817827 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-gxnbj"] Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.819665 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840353 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-ceph\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-scripts\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-logs\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840561 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840603 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w5pl\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-kube-api-access-6w5pl\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840629 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-config-data\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.840820 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.841090 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-logs\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.862638 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-ceph\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.864357 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-gxnbj"] Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.879778 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-config-data\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.882064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w5pl\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-kube-api-access-6w5pl\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.884520 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.886707 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-scripts\") pod \"glance-default-external-api-0\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.931780 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.942474 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mpx\" (UniqueName: \"kubernetes.io/projected/7dcdc436-edb5-44cd-97c5-08e25953972b-kube-api-access-k9mpx\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.942531 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-dns-svc\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.942617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-sb\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:25 crc 
kubenswrapper[4921]: I0318 13:43:25.942721 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-nb\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.942742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-config\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.967124 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.985314 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:25 crc kubenswrapper[4921]: I0318 13:43:25.987852 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.000338 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044706 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mpx\" (UniqueName: \"kubernetes.io/projected/7dcdc436-edb5-44cd-97c5-08e25953972b-kube-api-access-k9mpx\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044764 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-dns-svc\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044814 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044849 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-sb\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044865 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-logs\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044885 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044905 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044937 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.044965 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkc56\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-kube-api-access-dkc56\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.045011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.045041 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-nb\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.045065 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-config\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.046031 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-config\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.047047 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-nb\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.047155 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-dns-svc\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc 
kubenswrapper[4921]: I0318 13:43:26.047484 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-sb\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.068349 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mpx\" (UniqueName: \"kubernetes.io/projected/7dcdc436-edb5-44cd-97c5-08e25953972b-kube-api-access-k9mpx\") pod \"dnsmasq-dns-8565f7649c-gxnbj\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.144469 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.147448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.147539 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-logs\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.147571 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.147607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.148269 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.148338 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkc56\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-kube-api-access-dkc56\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.148403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.148427 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-logs\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.148499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.151126 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.153546 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.155050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.155099 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.198063 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkc56\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-kube-api-access-dkc56\") pod \"glance-default-internal-api-0\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.305718 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.707702 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-gxnbj"] Mar 18 13:43:26 crc kubenswrapper[4921]: I0318 13:43:26.758433 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:27 crc kubenswrapper[4921]: W0318 13:43:27.005337 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda030bf26_a447_4eae_b0c7_e035b0479939.slice/crio-4fe228966cc14c554e65e560397537cc0bb2971ddf430cf1466cae0ead96b40a WatchSource:0}: Error finding container 4fe228966cc14c554e65e560397537cc0bb2971ddf430cf1466cae0ead96b40a: Status 404 returned error can't find the container with id 4fe228966cc14c554e65e560397537cc0bb2971ddf430cf1466cae0ead96b40a Mar 18 13:43:27 crc kubenswrapper[4921]: I0318 13:43:27.006741 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:27 crc kubenswrapper[4921]: I0318 13:43:27.265157 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:27 crc kubenswrapper[4921]: I0318 13:43:27.414841 4921 generic.go:334] "Generic (PLEG): container finished" podID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerID="10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f" exitCode=0 Mar 18 13:43:27 crc 
kubenswrapper[4921]: I0318 13:43:27.414984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" event={"ID":"7dcdc436-edb5-44cd-97c5-08e25953972b","Type":"ContainerDied","Data":"10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f"} Mar 18 13:43:27 crc kubenswrapper[4921]: I0318 13:43:27.415016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" event={"ID":"7dcdc436-edb5-44cd-97c5-08e25953972b","Type":"ContainerStarted","Data":"60f82e4dbed34628c531d8f3911a3d5a469c0048135869d0181928b715561b1a"} Mar 18 13:43:27 crc kubenswrapper[4921]: I0318 13:43:27.427359 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6559531d-576f-47d3-aecb-13b3b5818429","Type":"ContainerStarted","Data":"11144dd3396abf70e80a2a28eb9254307e73d9909fbc6b1503ee02e0f1c055ab"} Mar 18 13:43:27 crc kubenswrapper[4921]: I0318 13:43:27.431100 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a030bf26-a447-4eae-b0c7-e035b0479939","Type":"ContainerStarted","Data":"4fe228966cc14c554e65e560397537cc0bb2971ddf430cf1466cae0ead96b40a"} Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.445439 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" event={"ID":"7dcdc436-edb5-44cd-97c5-08e25953972b","Type":"ContainerStarted","Data":"d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb"} Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.446036 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.447800 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6559531d-576f-47d3-aecb-13b3b5818429","Type":"ContainerStarted","Data":"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f"} Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.447852 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6559531d-576f-47d3-aecb-13b3b5818429","Type":"ContainerStarted","Data":"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0"} Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.447925 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-httpd" containerID="cri-o://9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f" gracePeriod=30 Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.447894 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-log" containerID="cri-o://aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0" gracePeriod=30 Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.450220 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a030bf26-a447-4eae-b0c7-e035b0479939","Type":"ContainerStarted","Data":"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402"} Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.450368 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a030bf26-a447-4eae-b0c7-e035b0479939","Type":"ContainerStarted","Data":"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533"} Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.471494 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" podStartSLOduration=3.471469117 podStartE2EDuration="3.471469117s" podCreationTimestamp="2026-03-18 13:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:28.467212805 +0000 UTC m=+5628.017133444" watchObservedRunningTime="2026-03-18 13:43:28.471469117 +0000 UTC m=+5628.021389756" Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.489427 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.489409369 podStartE2EDuration="3.489409369s" podCreationTimestamp="2026-03-18 13:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:28.48595158 +0000 UTC m=+5628.035872219" watchObservedRunningTime="2026-03-18 13:43:28.489409369 +0000 UTC m=+5628.039330008" Mar 18 13:43:28 crc kubenswrapper[4921]: I0318 13:43:28.515438 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.515410051 podStartE2EDuration="3.515410051s" podCreationTimestamp="2026-03-18 13:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:28.507251928 +0000 UTC m=+5628.057172577" watchObservedRunningTime="2026-03-18 13:43:28.515410051 +0000 UTC m=+5628.065330690" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.122043 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209460 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-combined-ca-bundle\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209526 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w5pl\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-kube-api-access-6w5pl\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209580 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-scripts\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209601 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-httpd-run\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209675 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-logs\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209714 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-ceph\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.209764 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-config-data\") pod \"6559531d-576f-47d3-aecb-13b3b5818429\" (UID: \"6559531d-576f-47d3-aecb-13b3b5818429\") " Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.211067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.212180 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-logs" (OuterVolumeSpecName: "logs") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.218099 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-scripts" (OuterVolumeSpecName: "scripts") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.227436 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-kube-api-access-6w5pl" (OuterVolumeSpecName: "kube-api-access-6w5pl") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "kube-api-access-6w5pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.234507 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-ceph" (OuterVolumeSpecName: "ceph") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.245967 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.261510 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-config-data" (OuterVolumeSpecName: "config-data") pod "6559531d-576f-47d3-aecb-13b3b5818429" (UID: "6559531d-576f-47d3-aecb-13b3b5818429"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312523 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312566 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w5pl\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-kube-api-access-6w5pl\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312581 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312591 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312602 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6559531d-576f-47d3-aecb-13b3b5818429-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312610 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6559531d-576f-47d3-aecb-13b3b5818429-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.312619 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6559531d-576f-47d3-aecb-13b3b5818429-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.460898 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="6559531d-576f-47d3-aecb-13b3b5818429" containerID="9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f" exitCode=0 Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.460943 4921 generic.go:334] "Generic (PLEG): container finished" podID="6559531d-576f-47d3-aecb-13b3b5818429" containerID="aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0" exitCode=143 Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.461913 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.466325 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6559531d-576f-47d3-aecb-13b3b5818429","Type":"ContainerDied","Data":"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f"} Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.466707 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6559531d-576f-47d3-aecb-13b3b5818429","Type":"ContainerDied","Data":"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0"} Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.466728 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6559531d-576f-47d3-aecb-13b3b5818429","Type":"ContainerDied","Data":"11144dd3396abf70e80a2a28eb9254307e73d9909fbc6b1503ee02e0f1c055ab"} Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.466747 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.466776 4921 scope.go:117] "RemoveContainer" containerID="9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.504600 4921 scope.go:117] "RemoveContainer" 
containerID="aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.509186 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.519789 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.524594 4921 scope.go:117] "RemoveContainer" containerID="9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f" Mar 18 13:43:29 crc kubenswrapper[4921]: E0318 13:43:29.524948 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f\": container with ID starting with 9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f not found: ID does not exist" containerID="9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.524976 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f"} err="failed to get container status \"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f\": rpc error: code = NotFound desc = could not find container \"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f\": container with ID starting with 9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f not found: ID does not exist" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.524996 4921 scope.go:117] "RemoveContainer" containerID="aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0" Mar 18 13:43:29 crc kubenswrapper[4921]: E0318 13:43:29.525288 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0\": container with ID starting with aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0 not found: ID does not exist" containerID="aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.525311 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0"} err="failed to get container status \"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0\": rpc error: code = NotFound desc = could not find container \"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0\": container with ID starting with aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0 not found: ID does not exist" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.525326 4921 scope.go:117] "RemoveContainer" containerID="9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.525487 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f"} err="failed to get container status \"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f\": rpc error: code = NotFound desc = could not find container \"9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f\": container with ID starting with 9e76cb95a2eb430c90090ef2fdb27493ed3fd631b5accec9dea063efebc60d5f not found: ID does not exist" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.525505 4921 scope.go:117] "RemoveContainer" containerID="aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.525668 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0"} err="failed to get container status \"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0\": rpc error: code = NotFound desc = could not find container \"aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0\": container with ID starting with aa6b57a793f25711821c1360e33a6e9a699b1803dd1f96aa2a4a089da0988fe0 not found: ID does not exist" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.537935 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:29 crc kubenswrapper[4921]: E0318 13:43:29.538346 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-httpd" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.538366 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-httpd" Mar 18 13:43:29 crc kubenswrapper[4921]: E0318 13:43:29.538393 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-log" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.538401 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-log" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.538586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-log" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.538604 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6559531d-576f-47d3-aecb-13b3b5818429" containerName="glance-httpd" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.539595 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.548160 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.556754 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618574 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618646 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv7m9\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-kube-api-access-tv7m9\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618675 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618818 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618843 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618864 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-ceph\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.618880 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-logs\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720312 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv7m9\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-kube-api-access-tv7m9\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720367 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720395 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720413 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-logs\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-ceph\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.720540 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc 
kubenswrapper[4921]: I0318 13:43:29.721786 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.721888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-logs\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.726881 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-ceph\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.727325 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.727756 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.728469 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.741963 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv7m9\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-kube-api-access-tv7m9\") pod \"glance-default-external-api-0\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " pod="openstack/glance-default-external-api-0" Mar 18 13:43:29 crc kubenswrapper[4921]: I0318 13:43:29.855374 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:43:30 crc kubenswrapper[4921]: I0318 13:43:30.472508 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-log" containerID="cri-o://37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533" gracePeriod=30 Mar 18 13:43:30 crc kubenswrapper[4921]: I0318 13:43:30.472601 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-httpd" containerID="cri-o://83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402" gracePeriod=30 Mar 18 13:43:30 crc kubenswrapper[4921]: I0318 13:43:30.528002 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:43:30 crc kubenswrapper[4921]: W0318 13:43:30.540338 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b1eff78_0677_46af_aa7d_6426e107ca86.slice/crio-2bc6933ae93eedcd664aa389df462e212b027f10c50ad6d0c9c1ce5d83032267 
WatchSource:0}: Error finding container 2bc6933ae93eedcd664aa389df462e212b027f10c50ad6d0c9c1ce5d83032267: Status 404 returned error can't find the container with id 2bc6933ae93eedcd664aa389df462e212b027f10c50ad6d0c9c1ce5d83032267 Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.169240 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.220311 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6559531d-576f-47d3-aecb-13b3b5818429" path="/var/lib/kubelet/pods/6559531d-576f-47d3-aecb-13b3b5818429/volumes" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkc56\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-kube-api-access-dkc56\") pod \"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252405 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-combined-ca-bundle\") pod \"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252469 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-config-data\") pod \"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252633 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-ceph\") pod 
\"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252827 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-httpd-run\") pod \"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252850 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-logs\") pod \"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.252870 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-scripts\") pod \"a030bf26-a447-4eae-b0c7-e035b0479939\" (UID: \"a030bf26-a447-4eae-b0c7-e035b0479939\") " Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.253832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-logs" (OuterVolumeSpecName: "logs") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.254214 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.256903 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-scripts" (OuterVolumeSpecName: "scripts") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.257086 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-kube-api-access-dkc56" (OuterVolumeSpecName: "kube-api-access-dkc56") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "kube-api-access-dkc56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.263209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-ceph" (OuterVolumeSpecName: "ceph") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.283246 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.323698 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-config-data" (OuterVolumeSpecName: "config-data") pod "a030bf26-a447-4eae-b0c7-e035b0479939" (UID: "a030bf26-a447-4eae-b0c7-e035b0479939"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361098 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361149 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361161 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a030bf26-a447-4eae-b0c7-e035b0479939-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361169 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361179 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkc56\" (UniqueName: \"kubernetes.io/projected/a030bf26-a447-4eae-b0c7-e035b0479939-kube-api-access-dkc56\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361190 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.361199 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a030bf26-a447-4eae-b0c7-e035b0479939-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.485291 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b1eff78-0677-46af-aa7d-6426e107ca86","Type":"ContainerStarted","Data":"871ef9689fa23ce411276d8d8f20ff4f9d6d4c2aafc27282632e5e5cbdbb6692"} Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.485349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b1eff78-0677-46af-aa7d-6426e107ca86","Type":"ContainerStarted","Data":"2bc6933ae93eedcd664aa389df462e212b027f10c50ad6d0c9c1ce5d83032267"} Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487279 4921 generic.go:334] "Generic (PLEG): container finished" podID="a030bf26-a447-4eae-b0c7-e035b0479939" containerID="83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402" exitCode=0 Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487318 4921 generic.go:334] "Generic (PLEG): container finished" podID="a030bf26-a447-4eae-b0c7-e035b0479939" containerID="37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533" exitCode=143 Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a030bf26-a447-4eae-b0c7-e035b0479939","Type":"ContainerDied","Data":"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402"} Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487385 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a030bf26-a447-4eae-b0c7-e035b0479939","Type":"ContainerDied","Data":"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533"} Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487408 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a030bf26-a447-4eae-b0c7-e035b0479939","Type":"ContainerDied","Data":"4fe228966cc14c554e65e560397537cc0bb2971ddf430cf1466cae0ead96b40a"} Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487439 4921 scope.go:117] "RemoveContainer" containerID="83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.487634 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.528267 4921 scope.go:117] "RemoveContainer" containerID="37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.538043 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.561690 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.582527 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:31 crc kubenswrapper[4921]: E0318 13:43:31.583394 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-httpd" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.583413 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-httpd" Mar 18 13:43:31 crc kubenswrapper[4921]: E0318 13:43:31.583444 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-log" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.583451 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-log" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.583814 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-log" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.583856 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" containerName="glance-httpd" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.585301 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.587633 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.588465 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.595486 4921 scope.go:117] "RemoveContainer" containerID="83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402" Mar 18 13:43:31 crc kubenswrapper[4921]: E0318 13:43:31.595954 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402\": container with ID starting with 83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402 not found: ID does not exist" containerID="83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.596024 4921 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402"} err="failed to get container status \"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402\": rpc error: code = NotFound desc = could not find container \"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402\": container with ID starting with 83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402 not found: ID does not exist" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.596060 4921 scope.go:117] "RemoveContainer" containerID="37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533" Mar 18 13:43:31 crc kubenswrapper[4921]: E0318 13:43:31.596690 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533\": container with ID starting with 37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533 not found: ID does not exist" containerID="37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.596719 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533"} err="failed to get container status \"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533\": rpc error: code = NotFound desc = could not find container \"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533\": container with ID starting with 37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533 not found: ID does not exist" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.596740 4921 scope.go:117] "RemoveContainer" containerID="83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.597182 4921 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402"} err="failed to get container status \"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402\": rpc error: code = NotFound desc = could not find container \"83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402\": container with ID starting with 83a8535c296c661dfbd1e26201e23935ba5757bf41c57ca0f59ff004b2bbc402 not found: ID does not exist" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.597207 4921 scope.go:117] "RemoveContainer" containerID="37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.599246 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533"} err="failed to get container status \"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533\": rpc error: code = NotFound desc = could not find container \"37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533\": container with ID starting with 37541617aad49bd889477eebf5264622c0b8a0d33916fd7f9f4b63f90eaaf533 not found: ID does not exist" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.671582 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.671824 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.671958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.672020 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.672064 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scm88\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-kube-api-access-scm88\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.672236 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.672324 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.773997 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scm88\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-kube-api-access-scm88\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774361 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774415 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774451 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774550 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774589 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774618 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.774905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.775259 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.779875 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.779973 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.780675 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.789494 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.791569 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scm88\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-kube-api-access-scm88\") pod \"glance-default-internal-api-0\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:43:31 crc kubenswrapper[4921]: I0318 13:43:31.918554 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:32 crc kubenswrapper[4921]: I0318 13:43:32.486619 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:43:32 crc kubenswrapper[4921]: W0318 13:43:32.491393 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ee9259b_c033_4560_a793_ee436c2bf5b8.slice/crio-5f1413b7181d5c023702ce414a05cfb13f8fb6bd8866bd807ff752ac3dd75f1a WatchSource:0}: Error finding container 5f1413b7181d5c023702ce414a05cfb13f8fb6bd8866bd807ff752ac3dd75f1a: Status 404 returned error can't find the container with id 5f1413b7181d5c023702ce414a05cfb13f8fb6bd8866bd807ff752ac3dd75f1a Mar 18 13:43:32 crc kubenswrapper[4921]: I0318 13:43:32.500183 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b1eff78-0677-46af-aa7d-6426e107ca86","Type":"ContainerStarted","Data":"3c577fa552ae3a4ed50a0e2913690b973ceab307e19a7f14017ba56effc08b15"} Mar 18 13:43:32 crc kubenswrapper[4921]: I0318 13:43:32.544736 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.544714672 podStartE2EDuration="3.544714672s" podCreationTimestamp="2026-03-18 13:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:32.533589594 +0000 UTC m=+5632.083510233" watchObservedRunningTime="2026-03-18 13:43:32.544714672 +0000 UTC m=+5632.094635311" Mar 18 13:43:33 crc kubenswrapper[4921]: I0318 13:43:33.221412 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a030bf26-a447-4eae-b0c7-e035b0479939" path="/var/lib/kubelet/pods/a030bf26-a447-4eae-b0c7-e035b0479939/volumes" Mar 18 13:43:33 crc kubenswrapper[4921]: I0318 13:43:33.527188 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ee9259b-c033-4560-a793-ee436c2bf5b8","Type":"ContainerStarted","Data":"6ded7fcb9b153cbbdab97d24eb0a4079e14521cd7c324d07b4893694f02a2246"} Mar 18 13:43:33 crc kubenswrapper[4921]: I0318 13:43:33.527517 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ee9259b-c033-4560-a793-ee436c2bf5b8","Type":"ContainerStarted","Data":"bfc018df9d6e84012230f326f54a78bc538ed65aa99536cd0dd2ebd1b2b84ed7"} Mar 18 13:43:33 crc kubenswrapper[4921]: I0318 13:43:33.527531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ee9259b-c033-4560-a793-ee436c2bf5b8","Type":"ContainerStarted","Data":"5f1413b7181d5c023702ce414a05cfb13f8fb6bd8866bd807ff752ac3dd75f1a"} Mar 18 13:43:33 crc kubenswrapper[4921]: I0318 13:43:33.566890 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.566869612 podStartE2EDuration="2.566869612s" podCreationTimestamp="2026-03-18 13:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:33.559569694 +0000 UTC m=+5633.109490333" watchObservedRunningTime="2026-03-18 13:43:33.566869612 +0000 UTC m=+5633.116790251" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.146826 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.217898 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-m2j24"] Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.218180 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" 
podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerName="dnsmasq-dns" containerID="cri-o://4f418bb058c35edd6a9445ffc565522883b1da78889e7df91473691168d0f8e2" gracePeriod=10 Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.556245 4921 generic.go:334] "Generic (PLEG): container finished" podID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerID="4f418bb058c35edd6a9445ffc565522883b1da78889e7df91473691168d0f8e2" exitCode=0 Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.556559 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" event={"ID":"930f175c-5b60-4f69-9e66-0be30d7d987b","Type":"ContainerDied","Data":"4f418bb058c35edd6a9445ffc565522883b1da78889e7df91473691168d0f8e2"} Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.694906 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.874451 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmsqd\" (UniqueName: \"kubernetes.io/projected/930f175c-5b60-4f69-9e66-0be30d7d987b-kube-api-access-fmsqd\") pod \"930f175c-5b60-4f69-9e66-0be30d7d987b\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.874590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-nb\") pod \"930f175c-5b60-4f69-9e66-0be30d7d987b\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.874676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-dns-svc\") pod \"930f175c-5b60-4f69-9e66-0be30d7d987b\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " Mar 
18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.874719 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-sb\") pod \"930f175c-5b60-4f69-9e66-0be30d7d987b\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.874815 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-config\") pod \"930f175c-5b60-4f69-9e66-0be30d7d987b\" (UID: \"930f175c-5b60-4f69-9e66-0be30d7d987b\") " Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.900031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930f175c-5b60-4f69-9e66-0be30d7d987b-kube-api-access-fmsqd" (OuterVolumeSpecName: "kube-api-access-fmsqd") pod "930f175c-5b60-4f69-9e66-0be30d7d987b" (UID: "930f175c-5b60-4f69-9e66-0be30d7d987b"). InnerVolumeSpecName "kube-api-access-fmsqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.924272 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-config" (OuterVolumeSpecName: "config") pod "930f175c-5b60-4f69-9e66-0be30d7d987b" (UID: "930f175c-5b60-4f69-9e66-0be30d7d987b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.929587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "930f175c-5b60-4f69-9e66-0be30d7d987b" (UID: "930f175c-5b60-4f69-9e66-0be30d7d987b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.931263 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "930f175c-5b60-4f69-9e66-0be30d7d987b" (UID: "930f175c-5b60-4f69-9e66-0be30d7d987b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.934624 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "930f175c-5b60-4f69-9e66-0be30d7d987b" (UID: "930f175c-5b60-4f69-9e66-0be30d7d987b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.977002 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmsqd\" (UniqueName: \"kubernetes.io/projected/930f175c-5b60-4f69-9e66-0be30d7d987b-kube-api-access-fmsqd\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.977047 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.977068 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.977081 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 
13:43:36 crc kubenswrapper[4921]: I0318 13:43:36.977093 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/930f175c-5b60-4f69-9e66-0be30d7d987b-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:37 crc kubenswrapper[4921]: I0318 13:43:37.566703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" event={"ID":"930f175c-5b60-4f69-9e66-0be30d7d987b","Type":"ContainerDied","Data":"bca2a6bcb01494334fd585a06ff87533f53f8ef9a00c8e1c4877d6d84fb68373"} Mar 18 13:43:37 crc kubenswrapper[4921]: I0318 13:43:37.566770 4921 scope.go:117] "RemoveContainer" containerID="4f418bb058c35edd6a9445ffc565522883b1da78889e7df91473691168d0f8e2" Mar 18 13:43:37 crc kubenswrapper[4921]: I0318 13:43:37.567003 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94d77d5bf-m2j24" Mar 18 13:43:37 crc kubenswrapper[4921]: I0318 13:43:37.602444 4921 scope.go:117] "RemoveContainer" containerID="3fcb61daf21eb6206c886f672522ba3348a06e34706b9cde6a4c37373cbfb407" Mar 18 13:43:37 crc kubenswrapper[4921]: I0318 13:43:37.613857 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-m2j24"] Mar 18 13:43:37 crc kubenswrapper[4921]: I0318 13:43:37.621822 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94d77d5bf-m2j24"] Mar 18 13:43:39 crc kubenswrapper[4921]: I0318 13:43:39.218749 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" path="/var/lib/kubelet/pods/930f175c-5b60-4f69-9e66-0be30d7d987b/volumes" Mar 18 13:43:39 crc kubenswrapper[4921]: I0318 13:43:39.855870 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:43:39 crc kubenswrapper[4921]: I0318 13:43:39.856277 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:43:39 crc kubenswrapper[4921]: I0318 13:43:39.883155 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:43:39 crc kubenswrapper[4921]: I0318 13:43:39.911798 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:43:40 crc kubenswrapper[4921]: I0318 13:43:40.600246 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:43:40 crc kubenswrapper[4921]: I0318 13:43:40.600295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:43:41 crc kubenswrapper[4921]: I0318 13:43:41.919538 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:41 crc kubenswrapper[4921]: I0318 13:43:41.919936 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:41 crc kubenswrapper[4921]: I0318 13:43:41.999626 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:42 crc kubenswrapper[4921]: I0318 13:43:42.048714 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:42 crc kubenswrapper[4921]: I0318 13:43:42.618328 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:42 crc kubenswrapper[4921]: I0318 13:43:42.618365 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:43 crc kubenswrapper[4921]: I0318 13:43:43.057914 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:43:43 crc kubenswrapper[4921]: I0318 13:43:43.058722 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:43:43 crc kubenswrapper[4921]: I0318 13:43:43.059097 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:43:44 crc kubenswrapper[4921]: I0318 13:43:44.632610 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:43:44 crc kubenswrapper[4921]: I0318 13:43:44.633151 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:43:44 crc kubenswrapper[4921]: I0318 13:43:44.816758 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:44 crc kubenswrapper[4921]: I0318 13:43:44.821648 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:43:47 crc kubenswrapper[4921]: I0318 13:43:47.081386 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:43:47 crc kubenswrapper[4921]: I0318 13:43:47.081764 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.259830 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-68l5k"] Mar 18 13:43:51 crc kubenswrapper[4921]: 
E0318 13:43:51.260643 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerName="dnsmasq-dns" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.260659 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerName="dnsmasq-dns" Mar 18 13:43:51 crc kubenswrapper[4921]: E0318 13:43:51.260679 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerName="init" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.260684 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerName="init" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.260892 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="930f175c-5b60-4f69-9e66-0be30d7d987b" containerName="dnsmasq-dns" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.261509 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-053c-account-create-update-hg5sk"] Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.262077 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.262197 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-68l5k"] Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.262248 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.268408 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.274202 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-053c-account-create-update-hg5sk"] Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.356512 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-operator-scripts\") pod \"placement-db-create-68l5k\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.356557 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-operator-scripts\") pod \"placement-053c-account-create-update-hg5sk\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.356653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27k59\" (UniqueName: \"kubernetes.io/projected/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-kube-api-access-27k59\") pod \"placement-db-create-68l5k\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.356759 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzfm\" (UniqueName: \"kubernetes.io/projected/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-kube-api-access-2dzfm\") pod 
\"placement-053c-account-create-update-hg5sk\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.458439 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27k59\" (UniqueName: \"kubernetes.io/projected/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-kube-api-access-27k59\") pod \"placement-db-create-68l5k\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.458625 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzfm\" (UniqueName: \"kubernetes.io/projected/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-kube-api-access-2dzfm\") pod \"placement-053c-account-create-update-hg5sk\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.458728 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-operator-scripts\") pod \"placement-db-create-68l5k\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.458773 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-operator-scripts\") pod \"placement-053c-account-create-update-hg5sk\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.459748 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-operator-scripts\") pod \"placement-053c-account-create-update-hg5sk\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.459783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-operator-scripts\") pod \"placement-db-create-68l5k\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.483283 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27k59\" (UniqueName: \"kubernetes.io/projected/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-kube-api-access-27k59\") pod \"placement-db-create-68l5k\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.487824 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dzfm\" (UniqueName: \"kubernetes.io/projected/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-kube-api-access-2dzfm\") pod \"placement-053c-account-create-update-hg5sk\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.589459 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-68l5k" Mar 18 13:43:51 crc kubenswrapper[4921]: I0318 13:43:51.599879 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.104958 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-68l5k"] Mar 18 13:43:52 crc kubenswrapper[4921]: W0318 13:43:52.108314 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4754cb8_9b80_4f83_a7eb_16895f1b3dee.slice/crio-e7e78c0586c3f170d72846bb687dd75e6ccb46adda794b519e1a8b031d9bacc9 WatchSource:0}: Error finding container e7e78c0586c3f170d72846bb687dd75e6ccb46adda794b519e1a8b031d9bacc9: Status 404 returned error can't find the container with id e7e78c0586c3f170d72846bb687dd75e6ccb46adda794b519e1a8b031d9bacc9 Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.198646 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-053c-account-create-update-hg5sk"] Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.721173 4921 generic.go:334] "Generic (PLEG): container finished" podID="1c9e083a-07ce-4f69-87aa-0fe7589ddeff" containerID="94df89f5af2eabe34ca7f1487ad99b61eb23822abafe4a0d77816af0aa322849" exitCode=0 Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.721216 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-053c-account-create-update-hg5sk" event={"ID":"1c9e083a-07ce-4f69-87aa-0fe7589ddeff","Type":"ContainerDied","Data":"94df89f5af2eabe34ca7f1487ad99b61eb23822abafe4a0d77816af0aa322849"} Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.721258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-053c-account-create-update-hg5sk" event={"ID":"1c9e083a-07ce-4f69-87aa-0fe7589ddeff","Type":"ContainerStarted","Data":"bf19e4f7f2e3b27e2e47fca9a9ca238a715c330e58bbe7472ac51df83f348454"} Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.722485 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="a4754cb8-9b80-4f83-a7eb-16895f1b3dee" containerID="ef7676a51c2301c5ce6d42d1d4cfa3e00deb06de83ac8c6c1cc3d8cb37fa41f9" exitCode=0 Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.722512 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-68l5k" event={"ID":"a4754cb8-9b80-4f83-a7eb-16895f1b3dee","Type":"ContainerDied","Data":"ef7676a51c2301c5ce6d42d1d4cfa3e00deb06de83ac8c6c1cc3d8cb37fa41f9"} Mar 18 13:43:52 crc kubenswrapper[4921]: I0318 13:43:52.722529 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-68l5k" event={"ID":"a4754cb8-9b80-4f83-a7eb-16895f1b3dee","Type":"ContainerStarted","Data":"e7e78c0586c3f170d72846bb687dd75e6ccb46adda794b519e1a8b031d9bacc9"} Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.206332 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-68l5k" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.308311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-operator-scripts\") pod \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.308429 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27k59\" (UniqueName: \"kubernetes.io/projected/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-kube-api-access-27k59\") pod \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\" (UID: \"a4754cb8-9b80-4f83-a7eb-16895f1b3dee\") " Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.308971 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4754cb8-9b80-4f83-a7eb-16895f1b3dee" (UID: 
"a4754cb8-9b80-4f83-a7eb-16895f1b3dee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.314269 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-kube-api-access-27k59" (OuterVolumeSpecName: "kube-api-access-27k59") pod "a4754cb8-9b80-4f83-a7eb-16895f1b3dee" (UID: "a4754cb8-9b80-4f83-a7eb-16895f1b3dee"). InnerVolumeSpecName "kube-api-access-27k59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.350521 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.409601 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-operator-scripts\") pod \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.409751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dzfm\" (UniqueName: \"kubernetes.io/projected/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-kube-api-access-2dzfm\") pod \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\" (UID: \"1c9e083a-07ce-4f69-87aa-0fe7589ddeff\") " Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.410208 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c9e083a-07ce-4f69-87aa-0fe7589ddeff" (UID: "1c9e083a-07ce-4f69-87aa-0fe7589ddeff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.410312 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.410331 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27k59\" (UniqueName: \"kubernetes.io/projected/a4754cb8-9b80-4f83-a7eb-16895f1b3dee-kube-api-access-27k59\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.410344 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.412458 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-kube-api-access-2dzfm" (OuterVolumeSpecName: "kube-api-access-2dzfm") pod "1c9e083a-07ce-4f69-87aa-0fe7589ddeff" (UID: "1c9e083a-07ce-4f69-87aa-0fe7589ddeff"). InnerVolumeSpecName "kube-api-access-2dzfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.512796 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dzfm\" (UniqueName: \"kubernetes.io/projected/1c9e083a-07ce-4f69-87aa-0fe7589ddeff-kube-api-access-2dzfm\") on node \"crc\" DevicePath \"\"" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.742596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-68l5k" event={"ID":"a4754cb8-9b80-4f83-a7eb-16895f1b3dee","Type":"ContainerDied","Data":"e7e78c0586c3f170d72846bb687dd75e6ccb46adda794b519e1a8b031d9bacc9"} Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.742641 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e78c0586c3f170d72846bb687dd75e6ccb46adda794b519e1a8b031d9bacc9" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.742709 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-68l5k" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.745225 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-053c-account-create-update-hg5sk" event={"ID":"1c9e083a-07ce-4f69-87aa-0fe7589ddeff","Type":"ContainerDied","Data":"bf19e4f7f2e3b27e2e47fca9a9ca238a715c330e58bbe7472ac51df83f348454"} Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.745251 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf19e4f7f2e3b27e2e47fca9a9ca238a715c330e58bbe7472ac51df83f348454" Mar 18 13:43:54 crc kubenswrapper[4921]: I0318 13:43:54.745311 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-053c-account-create-update-hg5sk" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.509042 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-8f6ph"] Mar 18 13:43:56 crc kubenswrapper[4921]: E0318 13:43:56.510707 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9e083a-07ce-4f69-87aa-0fe7589ddeff" containerName="mariadb-account-create-update" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.510781 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9e083a-07ce-4f69-87aa-0fe7589ddeff" containerName="mariadb-account-create-update" Mar 18 13:43:56 crc kubenswrapper[4921]: E0318 13:43:56.510843 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4754cb8-9b80-4f83-a7eb-16895f1b3dee" containerName="mariadb-database-create" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.510905 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4754cb8-9b80-4f83-a7eb-16895f1b3dee" containerName="mariadb-database-create" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.511211 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4754cb8-9b80-4f83-a7eb-16895f1b3dee" containerName="mariadb-database-create" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.511286 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9e083a-07ce-4f69-87aa-0fe7589ddeff" containerName="mariadb-account-create-update" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.512232 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.536079 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-8f6ph"] Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.547012 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmnk\" (UniqueName: \"kubernetes.io/projected/26284230-21f3-4580-ac05-967a2b6c63ac-kube-api-access-4tmnk\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.547104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-nb\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.547140 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-dns-svc\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.547159 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-sb\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.547219 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-config\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.572678 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zc4tr"] Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.574892 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.577003 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.577529 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.577951 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g8fnk" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.590769 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zc4tr"] Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649163 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-nb\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649249 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-config-data\") pod \"placement-db-sync-zc4tr\" (UID: 
\"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649281 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-dns-svc\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-sb\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649326 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7md\" (UniqueName: \"kubernetes.io/projected/6b6f701e-7f40-4029-b66b-c840042d5058-kube-api-access-md7md\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649346 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-scripts\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649408 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-config\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " 
pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649428 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b6f701e-7f40-4029-b66b-c840042d5058-logs\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649492 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmnk\" (UniqueName: \"kubernetes.io/projected/26284230-21f3-4580-ac05-967a2b6c63ac-kube-api-access-4tmnk\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.649568 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-combined-ca-bundle\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.650167 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-nb\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.650670 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-sb\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 
13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.650987 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-dns-svc\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.652818 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-config\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.675045 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmnk\" (UniqueName: \"kubernetes.io/projected/26284230-21f3-4580-ac05-967a2b6c63ac-kube-api-access-4tmnk\") pod \"dnsmasq-dns-85c649d7bf-8f6ph\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") " pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.750892 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-combined-ca-bundle\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.750965 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-config-data\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.750994 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-md7md\" (UniqueName: \"kubernetes.io/projected/6b6f701e-7f40-4029-b66b-c840042d5058-kube-api-access-md7md\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.751016 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-scripts\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.751062 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b6f701e-7f40-4029-b66b-c840042d5058-logs\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.751956 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b6f701e-7f40-4029-b66b-c840042d5058-logs\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.754583 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-scripts\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.754605 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-config-data\") pod \"placement-db-sync-zc4tr\" (UID: 
\"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.754603 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-combined-ca-bundle\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.775672 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7md\" (UniqueName: \"kubernetes.io/projected/6b6f701e-7f40-4029-b66b-c840042d5058-kube-api-access-md7md\") pod \"placement-db-sync-zc4tr\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.838381 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:56 crc kubenswrapper[4921]: I0318 13:43:56.897573 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zc4tr" Mar 18 13:43:57 crc kubenswrapper[4921]: W0318 13:43:57.294690 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26284230_21f3_4580_ac05_967a2b6c63ac.slice/crio-1197debbeaafe07252c2dd276443483ffd3afa31254498e49dad820e025fe3c3 WatchSource:0}: Error finding container 1197debbeaafe07252c2dd276443483ffd3afa31254498e49dad820e025fe3c3: Status 404 returned error can't find the container with id 1197debbeaafe07252c2dd276443483ffd3afa31254498e49dad820e025fe3c3 Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.296896 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-8f6ph"] Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.387407 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zc4tr"] Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.772744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zc4tr" event={"ID":"6b6f701e-7f40-4029-b66b-c840042d5058","Type":"ContainerStarted","Data":"1929ca627aecea3020096d032d97e843b923a9738afafa42b444d8167df160cb"} Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.773177 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zc4tr" event={"ID":"6b6f701e-7f40-4029-b66b-c840042d5058","Type":"ContainerStarted","Data":"69a8cf8023642fde428b69f1eefe55306d96bcd500f6495f805cca060e9cf53e"} Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.774026 4921 generic.go:334] "Generic (PLEG): container finished" podID="26284230-21f3-4580-ac05-967a2b6c63ac" containerID="60de801e0764e55095358917189e34a945936ba38e159b606379d4b4625325df" exitCode=0 Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.774057 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" 
event={"ID":"26284230-21f3-4580-ac05-967a2b6c63ac","Type":"ContainerDied","Data":"60de801e0764e55095358917189e34a945936ba38e159b606379d4b4625325df"} Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.774075 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" event={"ID":"26284230-21f3-4580-ac05-967a2b6c63ac","Type":"ContainerStarted","Data":"1197debbeaafe07252c2dd276443483ffd3afa31254498e49dad820e025fe3c3"} Mar 18 13:43:57 crc kubenswrapper[4921]: I0318 13:43:57.792068 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zc4tr" podStartSLOduration=1.792050954 podStartE2EDuration="1.792050954s" podCreationTimestamp="2026-03-18 13:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:57.788822811 +0000 UTC m=+5657.338743450" watchObservedRunningTime="2026-03-18 13:43:57.792050954 +0000 UTC m=+5657.341971583" Mar 18 13:43:58 crc kubenswrapper[4921]: I0318 13:43:58.784748 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" event={"ID":"26284230-21f3-4580-ac05-967a2b6c63ac","Type":"ContainerStarted","Data":"630f0fb1f6c1b5baad5beba796c852444f5afb0d92af123423726c1973ba554d"} Mar 18 13:43:58 crc kubenswrapper[4921]: I0318 13:43:58.784895 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:43:58 crc kubenswrapper[4921]: I0318 13:43:58.821144 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" podStartSLOduration=2.821091971 podStartE2EDuration="2.821091971s" podCreationTimestamp="2026-03-18 13:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:43:58.811382793 +0000 UTC 
m=+5658.361303442" watchObservedRunningTime="2026-03-18 13:43:58.821091971 +0000 UTC m=+5658.371012610" Mar 18 13:43:59 crc kubenswrapper[4921]: I0318 13:43:59.795811 4921 generic.go:334] "Generic (PLEG): container finished" podID="6b6f701e-7f40-4029-b66b-c840042d5058" containerID="1929ca627aecea3020096d032d97e843b923a9738afafa42b444d8167df160cb" exitCode=0 Mar 18 13:43:59 crc kubenswrapper[4921]: I0318 13:43:59.795928 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zc4tr" event={"ID":"6b6f701e-7f40-4029-b66b-c840042d5058","Type":"ContainerDied","Data":"1929ca627aecea3020096d032d97e843b923a9738afafa42b444d8167df160cb"} Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.137879 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564024-hps95"] Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.139081 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.144411 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-hps95"] Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.176826 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.177225 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.177263 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.236100 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnmx\" (UniqueName: 
\"kubernetes.io/projected/68fac776-8d23-4400-8243-3ef48f80b898-kube-api-access-nnnmx\") pod \"auto-csr-approver-29564024-hps95\" (UID: \"68fac776-8d23-4400-8243-3ef48f80b898\") " pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.338011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnmx\" (UniqueName: \"kubernetes.io/projected/68fac776-8d23-4400-8243-3ef48f80b898-kube-api-access-nnnmx\") pod \"auto-csr-approver-29564024-hps95\" (UID: \"68fac776-8d23-4400-8243-3ef48f80b898\") " pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.358723 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnmx\" (UniqueName: \"kubernetes.io/projected/68fac776-8d23-4400-8243-3ef48f80b898-kube-api-access-nnnmx\") pod \"auto-csr-approver-29564024-hps95\" (UID: \"68fac776-8d23-4400-8243-3ef48f80b898\") " pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.506022 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:00 crc kubenswrapper[4921]: I0318 13:44:00.950486 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-hps95"] Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.087390 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zc4tr" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.150061 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-scripts\") pod \"6b6f701e-7f40-4029-b66b-c840042d5058\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.150246 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-combined-ca-bundle\") pod \"6b6f701e-7f40-4029-b66b-c840042d5058\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.150312 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b6f701e-7f40-4029-b66b-c840042d5058-logs\") pod \"6b6f701e-7f40-4029-b66b-c840042d5058\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.150330 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-config-data\") pod \"6b6f701e-7f40-4029-b66b-c840042d5058\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.150354 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md7md\" (UniqueName: \"kubernetes.io/projected/6b6f701e-7f40-4029-b66b-c840042d5058-kube-api-access-md7md\") pod \"6b6f701e-7f40-4029-b66b-c840042d5058\" (UID: \"6b6f701e-7f40-4029-b66b-c840042d5058\") " Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.151757 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6b6f701e-7f40-4029-b66b-c840042d5058-logs" (OuterVolumeSpecName: "logs") pod "6b6f701e-7f40-4029-b66b-c840042d5058" (UID: "6b6f701e-7f40-4029-b66b-c840042d5058"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.155644 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-scripts" (OuterVolumeSpecName: "scripts") pod "6b6f701e-7f40-4029-b66b-c840042d5058" (UID: "6b6f701e-7f40-4029-b66b-c840042d5058"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.155993 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6f701e-7f40-4029-b66b-c840042d5058-kube-api-access-md7md" (OuterVolumeSpecName: "kube-api-access-md7md") pod "6b6f701e-7f40-4029-b66b-c840042d5058" (UID: "6b6f701e-7f40-4029-b66b-c840042d5058"). InnerVolumeSpecName "kube-api-access-md7md". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.176330 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-config-data" (OuterVolumeSpecName: "config-data") pod "6b6f701e-7f40-4029-b66b-c840042d5058" (UID: "6b6f701e-7f40-4029-b66b-c840042d5058"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.177095 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b6f701e-7f40-4029-b66b-c840042d5058" (UID: "6b6f701e-7f40-4029-b66b-c840042d5058"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.251911 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b6f701e-7f40-4029-b66b-c840042d5058-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.251956 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.251970 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md7md\" (UniqueName: \"kubernetes.io/projected/6b6f701e-7f40-4029-b66b-c840042d5058-kube-api-access-md7md\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.251982 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.251995 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6f701e-7f40-4029-b66b-c840042d5058-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.813757 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-hps95" event={"ID":"68fac776-8d23-4400-8243-3ef48f80b898","Type":"ContainerStarted","Data":"5bd4aa35285d9c1b6aaf3c3fc7254bef8f35720fd68b8fa060fb2e7c40fda315"} Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.815469 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zc4tr" event={"ID":"6b6f701e-7f40-4029-b66b-c840042d5058","Type":"ContainerDied","Data":"69a8cf8023642fde428b69f1eefe55306d96bcd500f6495f805cca060e9cf53e"} 
Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.815496 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69a8cf8023642fde428b69f1eefe55306d96bcd500f6495f805cca060e9cf53e" Mar 18 13:44:01 crc kubenswrapper[4921]: I0318 13:44:01.815565 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zc4tr" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.003899 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c769655b6-7qjkh"] Mar 18 13:44:02 crc kubenswrapper[4921]: E0318 13:44:02.004916 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6f701e-7f40-4029-b66b-c840042d5058" containerName="placement-db-sync" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.005023 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f701e-7f40-4029-b66b-c840042d5058" containerName="placement-db-sync" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.005388 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6f701e-7f40-4029-b66b-c840042d5058" containerName="placement-db-sync" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.006908 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.009996 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.010022 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g8fnk" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.009996 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.012624 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c769655b6-7qjkh"] Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.065349 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9fm7\" (UniqueName: \"kubernetes.io/projected/da67364d-59f5-4049-b4a8-262b6ccf4eb5-kube-api-access-s9fm7\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.065649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-scripts\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.065810 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67364d-59f5-4049-b4a8-262b6ccf4eb5-logs\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.065986 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-combined-ca-bundle\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.066139 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-config-data\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.167371 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9fm7\" (UniqueName: \"kubernetes.io/projected/da67364d-59f5-4049-b4a8-262b6ccf4eb5-kube-api-access-s9fm7\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.167638 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-scripts\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.167716 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67364d-59f5-4049-b4a8-262b6ccf4eb5-logs\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.167813 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-combined-ca-bundle\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.167928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-config-data\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.169132 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da67364d-59f5-4049-b4a8-262b6ccf4eb5-logs\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.173776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-config-data\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.174759 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-combined-ca-bundle\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.175089 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da67364d-59f5-4049-b4a8-262b6ccf4eb5-scripts\") pod \"placement-c769655b6-7qjkh\" (UID: 
\"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.187036 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9fm7\" (UniqueName: \"kubernetes.io/projected/da67364d-59f5-4049-b4a8-262b6ccf4eb5-kube-api-access-s9fm7\") pod \"placement-c769655b6-7qjkh\" (UID: \"da67364d-59f5-4049-b4a8-262b6ccf4eb5\") " pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.378332 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.825555 4921 generic.go:334] "Generic (PLEG): container finished" podID="68fac776-8d23-4400-8243-3ef48f80b898" containerID="3f3776e8291f235fd28f10011807d0e6f79a195e770d2e905b94be9ddf9e6b09" exitCode=0 Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.825613 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-hps95" event={"ID":"68fac776-8d23-4400-8243-3ef48f80b898","Type":"ContainerDied","Data":"3f3776e8291f235fd28f10011807d0e6f79a195e770d2e905b94be9ddf9e6b09"} Mar 18 13:44:02 crc kubenswrapper[4921]: I0318 13:44:02.862277 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c769655b6-7qjkh"] Mar 18 13:44:02 crc kubenswrapper[4921]: W0318 13:44:02.867922 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda67364d_59f5_4049_b4a8_262b6ccf4eb5.slice/crio-fd1187e7eed8b62f156d16ae352e5736fe23a3f58e5b5c173e3834725f1e2619 WatchSource:0}: Error finding container fd1187e7eed8b62f156d16ae352e5736fe23a3f58e5b5c173e3834725f1e2619: Status 404 returned error can't find the container with id fd1187e7eed8b62f156d16ae352e5736fe23a3f58e5b5c173e3834725f1e2619 Mar 18 13:44:03 crc kubenswrapper[4921]: I0318 13:44:03.845350 
4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c769655b6-7qjkh" event={"ID":"da67364d-59f5-4049-b4a8-262b6ccf4eb5","Type":"ContainerStarted","Data":"6aeb0cd1b434006ccac134844a7722150d573dee52a56763dc21591bfd7f03e6"} Mar 18 13:44:03 crc kubenswrapper[4921]: I0318 13:44:03.845722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c769655b6-7qjkh" event={"ID":"da67364d-59f5-4049-b4a8-262b6ccf4eb5","Type":"ContainerStarted","Data":"4a3f926e34a284796454a9b5b91fce6ed5384b8011e02f42dc8ab7ef93afa4fa"} Mar 18 13:44:03 crc kubenswrapper[4921]: I0318 13:44:03.845768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c769655b6-7qjkh" event={"ID":"da67364d-59f5-4049-b4a8-262b6ccf4eb5","Type":"ContainerStarted","Data":"fd1187e7eed8b62f156d16ae352e5736fe23a3f58e5b5c173e3834725f1e2619"} Mar 18 13:44:03 crc kubenswrapper[4921]: I0318 13:44:03.845790 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:03 crc kubenswrapper[4921]: I0318 13:44:03.845808 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:03 crc kubenswrapper[4921]: I0318 13:44:03.877356 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c769655b6-7qjkh" podStartSLOduration=2.877336049 podStartE2EDuration="2.877336049s" podCreationTimestamp="2026-03-18 13:44:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:44:03.86022104 +0000 UTC m=+5663.410141679" watchObservedRunningTime="2026-03-18 13:44:03.877336049 +0000 UTC m=+5663.427256678" Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.174088 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.201428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnnmx\" (UniqueName: \"kubernetes.io/projected/68fac776-8d23-4400-8243-3ef48f80b898-kube-api-access-nnnmx\") pod \"68fac776-8d23-4400-8243-3ef48f80b898\" (UID: \"68fac776-8d23-4400-8243-3ef48f80b898\") " Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.207011 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fac776-8d23-4400-8243-3ef48f80b898-kube-api-access-nnnmx" (OuterVolumeSpecName: "kube-api-access-nnnmx") pod "68fac776-8d23-4400-8243-3ef48f80b898" (UID: "68fac776-8d23-4400-8243-3ef48f80b898"). InnerVolumeSpecName "kube-api-access-nnnmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.304461 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnnmx\" (UniqueName: \"kubernetes.io/projected/68fac776-8d23-4400-8243-3ef48f80b898-kube-api-access-nnnmx\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.856445 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-hps95" Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.856415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-hps95" event={"ID":"68fac776-8d23-4400-8243-3ef48f80b898","Type":"ContainerDied","Data":"5bd4aa35285d9c1b6aaf3c3fc7254bef8f35720fd68b8fa060fb2e7c40fda315"} Mar 18 13:44:04 crc kubenswrapper[4921]: I0318 13:44:04.857079 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bd4aa35285d9c1b6aaf3c3fc7254bef8f35720fd68b8fa060fb2e7c40fda315" Mar 18 13:44:05 crc kubenswrapper[4921]: I0318 13:44:05.250630 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-qln2k"] Mar 18 13:44:05 crc kubenswrapper[4921]: I0318 13:44:05.259090 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-qln2k"] Mar 18 13:44:06 crc kubenswrapper[4921]: I0318 13:44:06.840408 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" Mar 18 13:44:06 crc kubenswrapper[4921]: I0318 13:44:06.901132 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-gxnbj"] Mar 18 13:44:06 crc kubenswrapper[4921]: I0318 13:44:06.901383 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerName="dnsmasq-dns" containerID="cri-o://d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb" gracePeriod=10 Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.224833 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3eea75-7a37-491f-969e-fb42b499ed50" path="/var/lib/kubelet/pods/0b3eea75-7a37-491f-969e-fb42b499ed50/volumes" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.385836 4921 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.462542 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-sb\") pod \"7dcdc436-edb5-44cd-97c5-08e25953972b\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.462716 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-dns-svc\") pod \"7dcdc436-edb5-44cd-97c5-08e25953972b\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.462760 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-config\") pod \"7dcdc436-edb5-44cd-97c5-08e25953972b\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.462832 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-nb\") pod \"7dcdc436-edb5-44cd-97c5-08e25953972b\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.462860 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9mpx\" (UniqueName: \"kubernetes.io/projected/7dcdc436-edb5-44cd-97c5-08e25953972b-kube-api-access-k9mpx\") pod \"7dcdc436-edb5-44cd-97c5-08e25953972b\" (UID: \"7dcdc436-edb5-44cd-97c5-08e25953972b\") " Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.468174 4921 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/7dcdc436-edb5-44cd-97c5-08e25953972b-kube-api-access-k9mpx" (OuterVolumeSpecName: "kube-api-access-k9mpx") pod "7dcdc436-edb5-44cd-97c5-08e25953972b" (UID: "7dcdc436-edb5-44cd-97c5-08e25953972b"). InnerVolumeSpecName "kube-api-access-k9mpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.505618 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7dcdc436-edb5-44cd-97c5-08e25953972b" (UID: "7dcdc436-edb5-44cd-97c5-08e25953972b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.505666 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7dcdc436-edb5-44cd-97c5-08e25953972b" (UID: "7dcdc436-edb5-44cd-97c5-08e25953972b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.505969 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7dcdc436-edb5-44cd-97c5-08e25953972b" (UID: "7dcdc436-edb5-44cd-97c5-08e25953972b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.507489 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-config" (OuterVolumeSpecName: "config") pod "7dcdc436-edb5-44cd-97c5-08e25953972b" (UID: "7dcdc436-edb5-44cd-97c5-08e25953972b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.565218 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.565262 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.565275 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.565289 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dcdc436-edb5-44cd-97c5-08e25953972b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.565302 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9mpx\" (UniqueName: \"kubernetes.io/projected/7dcdc436-edb5-44cd-97c5-08e25953972b-kube-api-access-k9mpx\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.894944 4921 generic.go:334] "Generic (PLEG): container finished" podID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerID="d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb" exitCode=0 Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.894988 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" event={"ID":"7dcdc436-edb5-44cd-97c5-08e25953972b","Type":"ContainerDied","Data":"d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb"} Mar 18 13:44:07 crc 
kubenswrapper[4921]: I0318 13:44:07.895042 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" event={"ID":"7dcdc436-edb5-44cd-97c5-08e25953972b","Type":"ContainerDied","Data":"60f82e4dbed34628c531d8f3911a3d5a469c0048135869d0181928b715561b1a"} Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.895042 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8565f7649c-gxnbj" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.895065 4921 scope.go:117] "RemoveContainer" containerID="d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.917979 4921 scope.go:117] "RemoveContainer" containerID="10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.939205 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-gxnbj"] Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.941238 4921 scope.go:117] "RemoveContainer" containerID="d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb" Mar 18 13:44:07 crc kubenswrapper[4921]: E0318 13:44:07.941779 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb\": container with ID starting with d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb not found: ID does not exist" containerID="d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.941850 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb"} err="failed to get container status \"d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb\": rpc error: 
code = NotFound desc = could not find container \"d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb\": container with ID starting with d3c2ce353b36acd4770148d8c5c5c32fc36eb251d30a52afaa8781b833fff5bb not found: ID does not exist" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.941887 4921 scope.go:117] "RemoveContainer" containerID="10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f" Mar 18 13:44:07 crc kubenswrapper[4921]: E0318 13:44:07.942286 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f\": container with ID starting with 10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f not found: ID does not exist" containerID="10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.942333 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f"} err="failed to get container status \"10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f\": rpc error: code = NotFound desc = could not find container \"10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f\": container with ID starting with 10a83d10c03a830d5f86900f7f7041f0fbd8fc35b74e0f9ca978b99cce3ae86f not found: ID does not exist" Mar 18 13:44:07 crc kubenswrapper[4921]: I0318 13:44:07.946171 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8565f7649c-gxnbj"] Mar 18 13:44:08 crc kubenswrapper[4921]: I0318 13:44:08.332758 4921 scope.go:117] "RemoveContainer" containerID="0be02c9084b987d3f4d3b22d032167502c9334c2620a589cbaafe815a0ba47ad" Mar 18 13:44:08 crc kubenswrapper[4921]: I0318 13:44:08.381211 4921 scope.go:117] "RemoveContainer" 
containerID="686b62f7964b4d79ab1b9e641d0d3f6df610ee2bb22718bd2feed005e9688a46" Mar 18 13:44:08 crc kubenswrapper[4921]: I0318 13:44:08.419537 4921 scope.go:117] "RemoveContainer" containerID="74205b39a361119a37adf1c916397297fe4f1d0a31dd3d1f01334c12bb92c426" Mar 18 13:44:09 crc kubenswrapper[4921]: I0318 13:44:09.220743 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" path="/var/lib/kubelet/pods/7dcdc436-edb5-44cd-97c5-08e25953972b/volumes" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.516306 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-27cnm"] Mar 18 13:44:11 crc kubenswrapper[4921]: E0318 13:44:11.517134 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerName="dnsmasq-dns" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.517154 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerName="dnsmasq-dns" Mar 18 13:44:11 crc kubenswrapper[4921]: E0318 13:44:11.517178 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerName="init" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.517187 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" containerName="init" Mar 18 13:44:11 crc kubenswrapper[4921]: E0318 13:44:11.517224 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fac776-8d23-4400-8243-3ef48f80b898" containerName="oc" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.517234 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fac776-8d23-4400-8243-3ef48f80b898" containerName="oc" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.517440 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcdc436-edb5-44cd-97c5-08e25953972b" 
containerName="dnsmasq-dns" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.517472 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fac776-8d23-4400-8243-3ef48f80b898" containerName="oc" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.519664 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.533050 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27cnm"] Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.650024 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npmqz\" (UniqueName: \"kubernetes.io/projected/37c78670-a990-4344-b309-b33494c796b3-kube-api-access-npmqz\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.650073 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-catalog-content\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.650250 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-utilities\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.751701 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-utilities\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.751817 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npmqz\" (UniqueName: \"kubernetes.io/projected/37c78670-a990-4344-b309-b33494c796b3-kube-api-access-npmqz\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.751849 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-catalog-content\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.752260 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-utilities\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.752452 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-catalog-content\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.772540 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmqz\" (UniqueName: 
\"kubernetes.io/projected/37c78670-a990-4344-b309-b33494c796b3-kube-api-access-npmqz\") pod \"certified-operators-27cnm\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:11 crc kubenswrapper[4921]: I0318 13:44:11.844602 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:12 crc kubenswrapper[4921]: I0318 13:44:12.344868 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27cnm"] Mar 18 13:44:12 crc kubenswrapper[4921]: I0318 13:44:12.941755 4921 generic.go:334] "Generic (PLEG): container finished" podID="37c78670-a990-4344-b309-b33494c796b3" containerID="9712368b8626823ca93b249edf812c8fef5606f7cf21767a2de5ed0b7acbd2d7" exitCode=0 Mar 18 13:44:12 crc kubenswrapper[4921]: I0318 13:44:12.941844 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27cnm" event={"ID":"37c78670-a990-4344-b309-b33494c796b3","Type":"ContainerDied","Data":"9712368b8626823ca93b249edf812c8fef5606f7cf21767a2de5ed0b7acbd2d7"} Mar 18 13:44:12 crc kubenswrapper[4921]: I0318 13:44:12.942145 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27cnm" event={"ID":"37c78670-a990-4344-b309-b33494c796b3","Type":"ContainerStarted","Data":"3a2a84d28c83374cf92be0029be4e3186b7b670786496d9c579cc124ddd6f3e7"} Mar 18 13:44:13 crc kubenswrapper[4921]: I0318 13:44:13.953839 4921 generic.go:334] "Generic (PLEG): container finished" podID="37c78670-a990-4344-b309-b33494c796b3" containerID="25d8dbc9cdc15f1e7dffb359d01db0ce4196c03fc618696bf43d2ac181f0106b" exitCode=0 Mar 18 13:44:13 crc kubenswrapper[4921]: I0318 13:44:13.953956 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27cnm" 
event={"ID":"37c78670-a990-4344-b309-b33494c796b3","Type":"ContainerDied","Data":"25d8dbc9cdc15f1e7dffb359d01db0ce4196c03fc618696bf43d2ac181f0106b"} Mar 18 13:44:15 crc kubenswrapper[4921]: I0318 13:44:15.978042 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27cnm" event={"ID":"37c78670-a990-4344-b309-b33494c796b3","Type":"ContainerStarted","Data":"4df3c4cf59a767ce2abb56b30f034dbc7220203ab64472e11d60d63ed6b107a0"} Mar 18 13:44:16 crc kubenswrapper[4921]: I0318 13:44:16.001886 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-27cnm" podStartSLOduration=2.894270546 podStartE2EDuration="5.001862915s" podCreationTimestamp="2026-03-18 13:44:11 +0000 UTC" firstStartedPulling="2026-03-18 13:44:12.945947553 +0000 UTC m=+5672.495868192" lastFinishedPulling="2026-03-18 13:44:15.053539912 +0000 UTC m=+5674.603460561" observedRunningTime="2026-03-18 13:44:15.997720007 +0000 UTC m=+5675.547640646" watchObservedRunningTime="2026-03-18 13:44:16.001862915 +0000 UTC m=+5675.551783554" Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.081428 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.081736 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.081772 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.082452 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.082502 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" gracePeriod=600 Mar 18 13:44:17 crc kubenswrapper[4921]: E0318 13:44:17.217586 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.998618 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" exitCode=0 Mar 18 13:44:17 crc kubenswrapper[4921]: I0318 13:44:17.998663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d"} Mar 18 13:44:17 crc 
kubenswrapper[4921]: I0318 13:44:17.998701 4921 scope.go:117] "RemoveContainer" containerID="925f41448af3066f1f8956601297f08a71fa65c35e3c953610bfb65d6cea1e9b" Mar 18 13:44:18 crc kubenswrapper[4921]: I0318 13:44:18.000243 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:44:18 crc kubenswrapper[4921]: E0318 13:44:18.000727 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:44:21 crc kubenswrapper[4921]: I0318 13:44:21.845585 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:21 crc kubenswrapper[4921]: I0318 13:44:21.846249 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:21 crc kubenswrapper[4921]: I0318 13:44:21.905154 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:22 crc kubenswrapper[4921]: I0318 13:44:22.081539 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:22 crc kubenswrapper[4921]: I0318 13:44:22.142508 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27cnm"] Mar 18 13:44:24 crc kubenswrapper[4921]: I0318 13:44:24.054003 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-27cnm" 
podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="registry-server" containerID="cri-o://4df3c4cf59a767ce2abb56b30f034dbc7220203ab64472e11d60d63ed6b107a0" gracePeriod=2 Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.065599 4921 generic.go:334] "Generic (PLEG): container finished" podID="37c78670-a990-4344-b309-b33494c796b3" containerID="4df3c4cf59a767ce2abb56b30f034dbc7220203ab64472e11d60d63ed6b107a0" exitCode=0 Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.066919 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27cnm" event={"ID":"37c78670-a990-4344-b309-b33494c796b3","Type":"ContainerDied","Data":"4df3c4cf59a767ce2abb56b30f034dbc7220203ab64472e11d60d63ed6b107a0"} Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.659723 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.717327 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-utilities\") pod \"37c78670-a990-4344-b309-b33494c796b3\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.717432 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npmqz\" (UniqueName: \"kubernetes.io/projected/37c78670-a990-4344-b309-b33494c796b3-kube-api-access-npmqz\") pod \"37c78670-a990-4344-b309-b33494c796b3\" (UID: \"37c78670-a990-4344-b309-b33494c796b3\") " Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.717549 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-catalog-content\") pod \"37c78670-a990-4344-b309-b33494c796b3\" (UID: 
\"37c78670-a990-4344-b309-b33494c796b3\") " Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.718864 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-utilities" (OuterVolumeSpecName: "utilities") pod "37c78670-a990-4344-b309-b33494c796b3" (UID: "37c78670-a990-4344-b309-b33494c796b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.722835 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c78670-a990-4344-b309-b33494c796b3-kube-api-access-npmqz" (OuterVolumeSpecName: "kube-api-access-npmqz") pod "37c78670-a990-4344-b309-b33494c796b3" (UID: "37c78670-a990-4344-b309-b33494c796b3"). InnerVolumeSpecName "kube-api-access-npmqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.763305 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37c78670-a990-4344-b309-b33494c796b3" (UID: "37c78670-a990-4344-b309-b33494c796b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.820780 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.820858 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c78670-a990-4344-b309-b33494c796b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:25 crc kubenswrapper[4921]: I0318 13:44:25.820870 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npmqz\" (UniqueName: \"kubernetes.io/projected/37c78670-a990-4344-b309-b33494c796b3-kube-api-access-npmqz\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.077894 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27cnm" event={"ID":"37c78670-a990-4344-b309-b33494c796b3","Type":"ContainerDied","Data":"3a2a84d28c83374cf92be0029be4e3186b7b670786496d9c579cc124ddd6f3e7"} Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.077962 4921 scope.go:117] "RemoveContainer" containerID="4df3c4cf59a767ce2abb56b30f034dbc7220203ab64472e11d60d63ed6b107a0" Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.078920 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27cnm" Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.101291 4921 scope.go:117] "RemoveContainer" containerID="25d8dbc9cdc15f1e7dffb359d01db0ce4196c03fc618696bf43d2ac181f0106b" Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.125384 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27cnm"] Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.133106 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-27cnm"] Mar 18 13:44:26 crc kubenswrapper[4921]: I0318 13:44:26.142504 4921 scope.go:117] "RemoveContainer" containerID="9712368b8626823ca93b249edf812c8fef5606f7cf21767a2de5ed0b7acbd2d7" Mar 18 13:44:27 crc kubenswrapper[4921]: I0318 13:44:27.219634 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c78670-a990-4344-b309-b33494c796b3" path="/var/lib/kubelet/pods/37c78670-a990-4344-b309-b33494c796b3/volumes" Mar 18 13:44:30 crc kubenswrapper[4921]: I0318 13:44:30.209759 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:44:30 crc kubenswrapper[4921]: E0318 13:44:30.210257 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:44:33 crc kubenswrapper[4921]: I0318 13:44:33.537844 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:33 crc kubenswrapper[4921]: I0318 13:44:33.547193 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/placement-c769655b6-7qjkh" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.703383 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwmql"] Mar 18 13:44:34 crc kubenswrapper[4921]: E0318 13:44:34.703995 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="registry-server" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.704014 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="registry-server" Mar 18 13:44:34 crc kubenswrapper[4921]: E0318 13:44:34.704044 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="extract-content" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.704051 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="extract-content" Mar 18 13:44:34 crc kubenswrapper[4921]: E0318 13:44:34.704070 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="extract-utilities" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.704077 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="extract-utilities" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.704299 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c78670-a990-4344-b309-b33494c796b3" containerName="registry-server" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.705890 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.717164 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwmql"] Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.824924 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-catalog-content\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.824992 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-utilities\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.825039 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2brq9\" (UniqueName: \"kubernetes.io/projected/5326024d-7481-45c4-8259-835eb43f8059-kube-api-access-2brq9\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.926905 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-catalog-content\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.926989 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-utilities\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.927025 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2brq9\" (UniqueName: \"kubernetes.io/projected/5326024d-7481-45c4-8259-835eb43f8059-kube-api-access-2brq9\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.927888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-catalog-content\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.928018 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-utilities\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:34 crc kubenswrapper[4921]: I0318 13:44:34.951549 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2brq9\" (UniqueName: \"kubernetes.io/projected/5326024d-7481-45c4-8259-835eb43f8059-kube-api-access-2brq9\") pod \"redhat-marketplace-dwmql\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:35 crc kubenswrapper[4921]: I0318 13:44:35.038614 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:35 crc kubenswrapper[4921]: I0318 13:44:35.524850 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwmql"] Mar 18 13:44:35 crc kubenswrapper[4921]: W0318 13:44:35.528603 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5326024d_7481_45c4_8259_835eb43f8059.slice/crio-ca39d886a971666fb8066d11de22925f596d979f98315d2fdeb0b52801659102 WatchSource:0}: Error finding container ca39d886a971666fb8066d11de22925f596d979f98315d2fdeb0b52801659102: Status 404 returned error can't find the container with id ca39d886a971666fb8066d11de22925f596d979f98315d2fdeb0b52801659102 Mar 18 13:44:36 crc kubenswrapper[4921]: I0318 13:44:36.217883 4921 generic.go:334] "Generic (PLEG): container finished" podID="5326024d-7481-45c4-8259-835eb43f8059" containerID="9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789" exitCode=0 Mar 18 13:44:36 crc kubenswrapper[4921]: I0318 13:44:36.217940 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwmql" event={"ID":"5326024d-7481-45c4-8259-835eb43f8059","Type":"ContainerDied","Data":"9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789"} Mar 18 13:44:36 crc kubenswrapper[4921]: I0318 13:44:36.218206 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwmql" event={"ID":"5326024d-7481-45c4-8259-835eb43f8059","Type":"ContainerStarted","Data":"ca39d886a971666fb8066d11de22925f596d979f98315d2fdeb0b52801659102"} Mar 18 13:44:38 crc kubenswrapper[4921]: I0318 13:44:38.239451 4921 generic.go:334] "Generic (PLEG): container finished" podID="5326024d-7481-45c4-8259-835eb43f8059" containerID="014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78" exitCode=0 Mar 18 13:44:38 crc kubenswrapper[4921]: I0318 
13:44:38.239512 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwmql" event={"ID":"5326024d-7481-45c4-8259-835eb43f8059","Type":"ContainerDied","Data":"014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78"} Mar 18 13:44:39 crc kubenswrapper[4921]: I0318 13:44:39.253531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwmql" event={"ID":"5326024d-7481-45c4-8259-835eb43f8059","Type":"ContainerStarted","Data":"754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb"} Mar 18 13:44:40 crc kubenswrapper[4921]: I0318 13:44:40.291371 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwmql" podStartSLOduration=3.561983562 podStartE2EDuration="6.291346961s" podCreationTimestamp="2026-03-18 13:44:34 +0000 UTC" firstStartedPulling="2026-03-18 13:44:36.219966679 +0000 UTC m=+5695.769887318" lastFinishedPulling="2026-03-18 13:44:38.949330078 +0000 UTC m=+5698.499250717" observedRunningTime="2026-03-18 13:44:40.281951243 +0000 UTC m=+5699.831871922" watchObservedRunningTime="2026-03-18 13:44:40.291346961 +0000 UTC m=+5699.841267600" Mar 18 13:44:41 crc kubenswrapper[4921]: I0318 13:44:41.228768 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:44:41 crc kubenswrapper[4921]: E0318 13:44:41.229000 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:44:45 crc kubenswrapper[4921]: I0318 13:44:45.039607 4921 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:45 crc kubenswrapper[4921]: I0318 13:44:45.040230 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:45 crc kubenswrapper[4921]: I0318 13:44:45.086194 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:45 crc kubenswrapper[4921]: I0318 13:44:45.360352 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:45 crc kubenswrapper[4921]: I0318 13:44:45.414606 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwmql"] Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.320597 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dwmql" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="registry-server" containerID="cri-o://754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb" gracePeriod=2 Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.810127 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.906796 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-catalog-content\") pod \"5326024d-7481-45c4-8259-835eb43f8059\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.906872 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2brq9\" (UniqueName: \"kubernetes.io/projected/5326024d-7481-45c4-8259-835eb43f8059-kube-api-access-2brq9\") pod \"5326024d-7481-45c4-8259-835eb43f8059\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.907069 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-utilities\") pod \"5326024d-7481-45c4-8259-835eb43f8059\" (UID: \"5326024d-7481-45c4-8259-835eb43f8059\") " Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.907923 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-utilities" (OuterVolumeSpecName: "utilities") pod "5326024d-7481-45c4-8259-835eb43f8059" (UID: "5326024d-7481-45c4-8259-835eb43f8059"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.914580 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5326024d-7481-45c4-8259-835eb43f8059-kube-api-access-2brq9" (OuterVolumeSpecName: "kube-api-access-2brq9") pod "5326024d-7481-45c4-8259-835eb43f8059" (UID: "5326024d-7481-45c4-8259-835eb43f8059"). InnerVolumeSpecName "kube-api-access-2brq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:47 crc kubenswrapper[4921]: I0318 13:44:47.954828 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5326024d-7481-45c4-8259-835eb43f8059" (UID: "5326024d-7481-45c4-8259-835eb43f8059"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.009539 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.009579 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5326024d-7481-45c4-8259-835eb43f8059-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.009596 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2brq9\" (UniqueName: \"kubernetes.io/projected/5326024d-7481-45c4-8259-835eb43f8059-kube-api-access-2brq9\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.332162 4921 generic.go:334] "Generic (PLEG): container finished" podID="5326024d-7481-45c4-8259-835eb43f8059" containerID="754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb" exitCode=0 Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.332210 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwmql" event={"ID":"5326024d-7481-45c4-8259-835eb43f8059","Type":"ContainerDied","Data":"754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb"} Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.333215 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dwmql" event={"ID":"5326024d-7481-45c4-8259-835eb43f8059","Type":"ContainerDied","Data":"ca39d886a971666fb8066d11de22925f596d979f98315d2fdeb0b52801659102"} Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.333240 4921 scope.go:117] "RemoveContainer" containerID="754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.332239 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwmql" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.369924 4921 scope.go:117] "RemoveContainer" containerID="014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.374555 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwmql"] Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.391680 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwmql"] Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.414396 4921 scope.go:117] "RemoveContainer" containerID="9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.444595 4921 scope.go:117] "RemoveContainer" containerID="754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb" Mar 18 13:44:48 crc kubenswrapper[4921]: E0318 13:44:48.445133 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb\": container with ID starting with 754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb not found: ID does not exist" containerID="754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.445234 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb"} err="failed to get container status \"754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb\": rpc error: code = NotFound desc = could not find container \"754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb\": container with ID starting with 754be29a84fec78cb3956517fb0736639dd31706c5a5cf03911c2164b5f0e6cb not found: ID does not exist" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.445319 4921 scope.go:117] "RemoveContainer" containerID="014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78" Mar 18 13:44:48 crc kubenswrapper[4921]: E0318 13:44:48.445742 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78\": container with ID starting with 014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78 not found: ID does not exist" containerID="014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.445875 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78"} err="failed to get container status \"014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78\": rpc error: code = NotFound desc = could not find container \"014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78\": container with ID starting with 014fc0ac2b0b573450432c32348029a9a0ee2b12b0eef0257f1b72523d863b78 not found: ID does not exist" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.445942 4921 scope.go:117] "RemoveContainer" containerID="9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789" Mar 18 13:44:48 crc kubenswrapper[4921]: E0318 
13:44:48.446362 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789\": container with ID starting with 9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789 not found: ID does not exist" containerID="9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789" Mar 18 13:44:48 crc kubenswrapper[4921]: I0318 13:44:48.446458 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789"} err="failed to get container status \"9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789\": rpc error: code = NotFound desc = could not find container \"9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789\": container with ID starting with 9af672ccfa38dceed718e8cffbc7958490984013549351e51cede8c380011789 not found: ID does not exist" Mar 18 13:44:49 crc kubenswrapper[4921]: I0318 13:44:49.224485 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5326024d-7481-45c4-8259-835eb43f8059" path="/var/lib/kubelet/pods/5326024d-7481-45c4-8259-835eb43f8059/volumes" Mar 18 13:44:52 crc kubenswrapper[4921]: I0318 13:44:52.209702 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:44:52 crc kubenswrapper[4921]: E0318 13:44:52.210334 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.988629 
4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vbtxw"] Mar 18 13:44:56 crc kubenswrapper[4921]: E0318 13:44:56.989673 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="registry-server" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.989694 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="registry-server" Mar 18 13:44:56 crc kubenswrapper[4921]: E0318 13:44:56.989717 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="extract-utilities" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.989725 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="extract-utilities" Mar 18 13:44:56 crc kubenswrapper[4921]: E0318 13:44:56.989756 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="extract-content" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.989763 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="extract-content" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.989975 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5326024d-7481-45c4-8259-835eb43f8059" containerName="registry-server" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.990702 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:56 crc kubenswrapper[4921]: I0318 13:44:56.998088 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vbtxw"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.081077 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d136526d-eb62-4c8e-a26e-250933f5f0f4-operator-scripts\") pod \"nova-api-db-create-vbtxw\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.081182 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpkj7\" (UniqueName: \"kubernetes.io/projected/d136526d-eb62-4c8e-a26e-250933f5f0f4-kube-api-access-gpkj7\") pod \"nova-api-db-create-vbtxw\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.182597 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d136526d-eb62-4c8e-a26e-250933f5f0f4-operator-scripts\") pod \"nova-api-db-create-vbtxw\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.182914 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpkj7\" (UniqueName: \"kubernetes.io/projected/d136526d-eb62-4c8e-a26e-250933f5f0f4-kube-api-access-gpkj7\") pod \"nova-api-db-create-vbtxw\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.183579 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d136526d-eb62-4c8e-a26e-250933f5f0f4-operator-scripts\") pod \"nova-api-db-create-vbtxw\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.184474 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6k79j"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.186064 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.193730 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-755e-account-create-update-bssdk"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.195137 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.201148 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.201240 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6k79j"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.218369 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpkj7\" (UniqueName: \"kubernetes.io/projected/d136526d-eb62-4c8e-a26e-250933f5f0f4-kube-api-access-gpkj7\") pod \"nova-api-db-create-vbtxw\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.220258 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-755e-account-create-update-bssdk"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.285809 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-operator-scripts\") pod \"nova-cell0-db-create-6k79j\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.286463 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75fe9e8d-d964-497d-8956-6ab4b35e8c79-operator-scripts\") pod \"nova-api-755e-account-create-update-bssdk\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.286622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wwdh\" (UniqueName: \"kubernetes.io/projected/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-kube-api-access-7wwdh\") pod \"nova-cell0-db-create-6k79j\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.286844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgfr\" (UniqueName: \"kubernetes.io/projected/75fe9e8d-d964-497d-8956-6ab4b35e8c79-kube-api-access-xhgfr\") pod \"nova-api-755e-account-create-update-bssdk\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.291715 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dp7ds"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.293041 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.307553 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dp7ds"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.309653 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.388492 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-operator-scripts\") pod \"nova-cell0-db-create-6k79j\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.388533 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75fe9e8d-d964-497d-8956-6ab4b35e8c79-operator-scripts\") pod \"nova-api-755e-account-create-update-bssdk\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.388569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wwdh\" (UniqueName: \"kubernetes.io/projected/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-kube-api-access-7wwdh\") pod \"nova-cell0-db-create-6k79j\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.388611 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a9a36da-9efd-4e81-bc24-925a880271da-operator-scripts\") pod \"nova-cell1-db-create-dp7ds\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " 
pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.388649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nscdk\" (UniqueName: \"kubernetes.io/projected/6a9a36da-9efd-4e81-bc24-925a880271da-kube-api-access-nscdk\") pod \"nova-cell1-db-create-dp7ds\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.388831 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgfr\" (UniqueName: \"kubernetes.io/projected/75fe9e8d-d964-497d-8956-6ab4b35e8c79-kube-api-access-xhgfr\") pod \"nova-api-755e-account-create-update-bssdk\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.389413 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75fe9e8d-d964-497d-8956-6ab4b35e8c79-operator-scripts\") pod \"nova-api-755e-account-create-update-bssdk\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.390070 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-operator-scripts\") pod \"nova-cell0-db-create-6k79j\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.397764 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9453-account-create-update-d76hx"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.398780 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.403463 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.407773 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wwdh\" (UniqueName: \"kubernetes.io/projected/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-kube-api-access-7wwdh\") pod \"nova-cell0-db-create-6k79j\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.411228 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgfr\" (UniqueName: \"kubernetes.io/projected/75fe9e8d-d964-497d-8956-6ab4b35e8c79-kube-api-access-xhgfr\") pod \"nova-api-755e-account-create-update-bssdk\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.459596 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9453-account-create-update-d76hx"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.491273 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8c2\" (UniqueName: \"kubernetes.io/projected/095fe9ee-f061-44f7-9c27-3a8700a1be60-kube-api-access-tq8c2\") pod \"nova-cell0-9453-account-create-update-d76hx\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.491739 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fe9ee-f061-44f7-9c27-3a8700a1be60-operator-scripts\") pod 
\"nova-cell0-9453-account-create-update-d76hx\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.491791 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a9a36da-9efd-4e81-bc24-925a880271da-operator-scripts\") pod \"nova-cell1-db-create-dp7ds\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.491895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nscdk\" (UniqueName: \"kubernetes.io/projected/6a9a36da-9efd-4e81-bc24-925a880271da-kube-api-access-nscdk\") pod \"nova-cell1-db-create-dp7ds\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.492585 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a9a36da-9efd-4e81-bc24-925a880271da-operator-scripts\") pod \"nova-cell1-db-create-dp7ds\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.510711 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nscdk\" (UniqueName: \"kubernetes.io/projected/6a9a36da-9efd-4e81-bc24-925a880271da-kube-api-access-nscdk\") pod \"nova-cell1-db-create-dp7ds\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.515746 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.571636 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.593395 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq8c2\" (UniqueName: \"kubernetes.io/projected/095fe9ee-f061-44f7-9c27-3a8700a1be60-kube-api-access-tq8c2\") pod \"nova-cell0-9453-account-create-update-d76hx\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.593529 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fe9ee-f061-44f7-9c27-3a8700a1be60-operator-scripts\") pod \"nova-cell0-9453-account-create-update-d76hx\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.594490 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fe9ee-f061-44f7-9c27-3a8700a1be60-operator-scripts\") pod \"nova-cell0-9453-account-create-update-d76hx\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.594489 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0efa-account-create-update-srvvf"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.595919 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.613705 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.614760 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.624127 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8c2\" (UniqueName: \"kubernetes.io/projected/095fe9ee-f061-44f7-9c27-3a8700a1be60-kube-api-access-tq8c2\") pod \"nova-cell0-9453-account-create-update-d76hx\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.634390 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0efa-account-create-update-srvvf"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.695264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-operator-scripts\") pod \"nova-cell1-0efa-account-create-update-srvvf\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.695423 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7m4\" (UniqueName: \"kubernetes.io/projected/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-kube-api-access-qx7m4\") pod \"nova-cell1-0efa-account-create-update-srvvf\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.789879 4921 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.796626 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7m4\" (UniqueName: \"kubernetes.io/projected/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-kube-api-access-qx7m4\") pod \"nova-cell1-0efa-account-create-update-srvvf\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.796683 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-operator-scripts\") pod \"nova-cell1-0efa-account-create-update-srvvf\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.797564 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-operator-scripts\") pod \"nova-cell1-0efa-account-create-update-srvvf\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.818342 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vbtxw"] Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.829323 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7m4\" (UniqueName: \"kubernetes.io/projected/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-kube-api-access-qx7m4\") pod \"nova-cell1-0efa-account-create-update-srvvf\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 
13:44:57 crc kubenswrapper[4921]: W0318 13:44:57.854181 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd136526d_eb62_4c8e_a26e_250933f5f0f4.slice/crio-d9aa975e04314a1c183500b641a5124654ea2db0db514a1e86f1592f8badbb08 WatchSource:0}: Error finding container d9aa975e04314a1c183500b641a5124654ea2db0db514a1e86f1592f8badbb08: Status 404 returned error can't find the container with id d9aa975e04314a1c183500b641a5124654ea2db0db514a1e86f1592f8badbb08 Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.939082 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:44:57 crc kubenswrapper[4921]: I0318 13:44:57.993140 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6k79j"] Mar 18 13:44:58 crc kubenswrapper[4921]: W0318 13:44:58.020685 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c218be7_6bd0_4e35_ad4c_4bc4aa635fae.slice/crio-20cd764a8e0eff6adf87c9918acf4e1d4462430ad7090b09d85c1dc4ed5070c2 WatchSource:0}: Error finding container 20cd764a8e0eff6adf87c9918acf4e1d4462430ad7090b09d85c1dc4ed5070c2: Status 404 returned error can't find the container with id 20cd764a8e0eff6adf87c9918acf4e1d4462430ad7090b09d85c1dc4ed5070c2 Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.148165 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dp7ds"] Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.218892 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-755e-account-create-update-bssdk"] Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.329504 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9453-account-create-update-d76hx"] Mar 18 13:44:58 crc kubenswrapper[4921]: 
I0318 13:44:58.444900 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9453-account-create-update-d76hx" event={"ID":"095fe9ee-f061-44f7-9c27-3a8700a1be60","Type":"ContainerStarted","Data":"54d80db395188aa909c0a2f29addf360a47b0ba765fc9c996c7d29a1ae3b4354"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.447416 4921 generic.go:334] "Generic (PLEG): container finished" podID="d136526d-eb62-4c8e-a26e-250933f5f0f4" containerID="60b7b6e0491b94b05ff7418c3c7541c3a304585fce61af6c2aac8f77125ac3f9" exitCode=0 Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.447470 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbtxw" event={"ID":"d136526d-eb62-4c8e-a26e-250933f5f0f4","Type":"ContainerDied","Data":"60b7b6e0491b94b05ff7418c3c7541c3a304585fce61af6c2aac8f77125ac3f9"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.447530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbtxw" event={"ID":"d136526d-eb62-4c8e-a26e-250933f5f0f4","Type":"ContainerStarted","Data":"d9aa975e04314a1c183500b641a5124654ea2db0db514a1e86f1592f8badbb08"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.448906 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dp7ds" event={"ID":"6a9a36da-9efd-4e81-bc24-925a880271da","Type":"ContainerStarted","Data":"cd3e6222d0802885a8f92aec72c70960898edf2eef211d88a902a3a3608c4ac2"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.450486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-755e-account-create-update-bssdk" event={"ID":"75fe9e8d-d964-497d-8956-6ab4b35e8c79","Type":"ContainerStarted","Data":"093f9ec574b9ef32ac7a98ed73516d7ebba61f506986e23c1d8c45da47f5afdb"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.452334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6k79j" 
event={"ID":"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae","Type":"ContainerStarted","Data":"f50a155c80d235574493bae686b8564dfc263bb9683eacb2dccaa598e220a47d"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.452361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6k79j" event={"ID":"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae","Type":"ContainerStarted","Data":"20cd764a8e0eff6adf87c9918acf4e1d4462430ad7090b09d85c1dc4ed5070c2"} Mar 18 13:44:58 crc kubenswrapper[4921]: I0318 13:44:58.504535 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0efa-account-create-update-srvvf"] Mar 18 13:44:58 crc kubenswrapper[4921]: W0318 13:44:58.558136 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ad6c5a7_b256_4df6_81a2_4cf2dd7fdb0f.slice/crio-e4203856b8d9188235264ba606b7e5a8c0c5c96bb1aa869a4dd2af96f672eb69 WatchSource:0}: Error finding container e4203856b8d9188235264ba606b7e5a8c0c5c96bb1aa869a4dd2af96f672eb69: Status 404 returned error can't find the container with id e4203856b8d9188235264ba606b7e5a8c0c5c96bb1aa869a4dd2af96f672eb69 Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.463860 4921 generic.go:334] "Generic (PLEG): container finished" podID="095fe9ee-f061-44f7-9c27-3a8700a1be60" containerID="b6f99e564045e1757bd4ac3e9496ecc2001dddd60ef44395a4a8474715cf4c31" exitCode=0 Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.463929 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9453-account-create-update-d76hx" event={"ID":"095fe9ee-f061-44f7-9c27-3a8700a1be60","Type":"ContainerDied","Data":"b6f99e564045e1757bd4ac3e9496ecc2001dddd60ef44395a4a8474715cf4c31"} Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.467381 4921 generic.go:334] "Generic (PLEG): container finished" podID="6a9a36da-9efd-4e81-bc24-925a880271da" 
containerID="32676dd38ff52b4419048836061dfe80ae96dd1f80849f1da8dfed633da0fafc" exitCode=0 Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.467452 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dp7ds" event={"ID":"6a9a36da-9efd-4e81-bc24-925a880271da","Type":"ContainerDied","Data":"32676dd38ff52b4419048836061dfe80ae96dd1f80849f1da8dfed633da0fafc"} Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.469849 4921 generic.go:334] "Generic (PLEG): container finished" podID="75fe9e8d-d964-497d-8956-6ab4b35e8c79" containerID="7eb96d68e7e625f94612a5a361d373fd32b7ed7dc9e46b2609b473e18c645388" exitCode=0 Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.469923 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-755e-account-create-update-bssdk" event={"ID":"75fe9e8d-d964-497d-8956-6ab4b35e8c79","Type":"ContainerDied","Data":"7eb96d68e7e625f94612a5a361d373fd32b7ed7dc9e46b2609b473e18c645388"} Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.472247 4921 generic.go:334] "Generic (PLEG): container finished" podID="2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" containerID="f50a155c80d235574493bae686b8564dfc263bb9683eacb2dccaa598e220a47d" exitCode=0 Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.472347 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6k79j" event={"ID":"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae","Type":"ContainerDied","Data":"f50a155c80d235574493bae686b8564dfc263bb9683eacb2dccaa598e220a47d"} Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.473948 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" containerID="4175a6147f35c76e976564f86ceadd9ee21103301f94f37f9ddc12e8dbc54b2f" exitCode=0 Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.473990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" 
event={"ID":"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f","Type":"ContainerDied","Data":"4175a6147f35c76e976564f86ceadd9ee21103301f94f37f9ddc12e8dbc54b2f"} Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.474015 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" event={"ID":"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f","Type":"ContainerStarted","Data":"e4203856b8d9188235264ba606b7e5a8c0c5c96bb1aa869a4dd2af96f672eb69"} Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.900511 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:44:59 crc kubenswrapper[4921]: I0318 13:44:59.910536 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.041073 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wwdh\" (UniqueName: \"kubernetes.io/projected/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-kube-api-access-7wwdh\") pod \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.042158 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d136526d-eb62-4c8e-a26e-250933f5f0f4-operator-scripts\") pod \"d136526d-eb62-4c8e-a26e-250933f5f0f4\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.042258 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-operator-scripts\") pod \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\" (UID: \"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.042436 
4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpkj7\" (UniqueName: \"kubernetes.io/projected/d136526d-eb62-4c8e-a26e-250933f5f0f4-kube-api-access-gpkj7\") pod \"d136526d-eb62-4c8e-a26e-250933f5f0f4\" (UID: \"d136526d-eb62-4c8e-a26e-250933f5f0f4\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.042628 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d136526d-eb62-4c8e-a26e-250933f5f0f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d136526d-eb62-4c8e-a26e-250933f5f0f4" (UID: "d136526d-eb62-4c8e-a26e-250933f5f0f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.042702 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" (UID: "2c218be7-6bd0-4e35-ad4c-4bc4aa635fae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.043153 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d136526d-eb62-4c8e-a26e-250933f5f0f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.043173 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.046765 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-kube-api-access-7wwdh" (OuterVolumeSpecName: "kube-api-access-7wwdh") pod "2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" (UID: "2c218be7-6bd0-4e35-ad4c-4bc4aa635fae"). InnerVolumeSpecName "kube-api-access-7wwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.046805 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d136526d-eb62-4c8e-a26e-250933f5f0f4-kube-api-access-gpkj7" (OuterVolumeSpecName: "kube-api-access-gpkj7") pod "d136526d-eb62-4c8e-a26e-250933f5f0f4" (UID: "d136526d-eb62-4c8e-a26e-250933f5f0f4"). InnerVolumeSpecName "kube-api-access-gpkj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.144580 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq"] Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.144941 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpkj7\" (UniqueName: \"kubernetes.io/projected/d136526d-eb62-4c8e-a26e-250933f5f0f4-kube-api-access-gpkj7\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:00 crc kubenswrapper[4921]: E0318 13:45:00.144980 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" containerName="mariadb-database-create" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.144987 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wwdh\" (UniqueName: \"kubernetes.io/projected/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae-kube-api-access-7wwdh\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.144993 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" containerName="mariadb-database-create" Mar 18 13:45:00 crc kubenswrapper[4921]: E0318 13:45:00.145043 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d136526d-eb62-4c8e-a26e-250933f5f0f4" containerName="mariadb-database-create" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.145059 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d136526d-eb62-4c8e-a26e-250933f5f0f4" containerName="mariadb-database-create" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.146382 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d136526d-eb62-4c8e-a26e-250933f5f0f4" containerName="mariadb-database-create" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.146410 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" containerName="mariadb-database-create" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.147225 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.149607 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.149615 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.153724 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq"] Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.246669 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p42zb\" (UniqueName: \"kubernetes.io/projected/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-kube-api-access-p42zb\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.247322 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-config-volume\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.247564 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-secret-volume\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.349454 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p42zb\" (UniqueName: \"kubernetes.io/projected/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-kube-api-access-p42zb\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.349613 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-config-volume\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.349685 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-secret-volume\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.350718 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-config-volume\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.353470 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-secret-volume\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.371462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p42zb\" (UniqueName: \"kubernetes.io/projected/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-kube-api-access-p42zb\") pod \"collect-profiles-29564025-n6vtq\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.470534 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.491131 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6k79j" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.491133 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6k79j" event={"ID":"2c218be7-6bd0-4e35-ad4c-4bc4aa635fae","Type":"ContainerDied","Data":"20cd764a8e0eff6adf87c9918acf4e1d4462430ad7090b09d85c1dc4ed5070c2"} Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.491264 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20cd764a8e0eff6adf87c9918acf4e1d4462430ad7090b09d85c1dc4ed5070c2" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.493598 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vbtxw" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.497164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vbtxw" event={"ID":"d136526d-eb62-4c8e-a26e-250933f5f0f4","Type":"ContainerDied","Data":"d9aa975e04314a1c183500b641a5124654ea2db0db514a1e86f1592f8badbb08"} Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.497228 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9aa975e04314a1c183500b641a5124654ea2db0db514a1e86f1592f8badbb08" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.896210 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.902029 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.974697 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-operator-scripts\") pod \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.974771 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nscdk\" (UniqueName: \"kubernetes.io/projected/6a9a36da-9efd-4e81-bc24-925a880271da-kube-api-access-nscdk\") pod \"6a9a36da-9efd-4e81-bc24-925a880271da\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.974879 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx7m4\" (UniqueName: \"kubernetes.io/projected/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-kube-api-access-qx7m4\") pod 
\"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\" (UID: \"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.974915 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a9a36da-9efd-4e81-bc24-925a880271da-operator-scripts\") pod \"6a9a36da-9efd-4e81-bc24-925a880271da\" (UID: \"6a9a36da-9efd-4e81-bc24-925a880271da\") " Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.975614 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" (UID: "1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.981032 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-kube-api-access-qx7m4" (OuterVolumeSpecName: "kube-api-access-qx7m4") pod "1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" (UID: "1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f"). InnerVolumeSpecName "kube-api-access-qx7m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.987484 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a9a36da-9efd-4e81-bc24-925a880271da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a9a36da-9efd-4e81-bc24-925a880271da" (UID: "6a9a36da-9efd-4e81-bc24-925a880271da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:00 crc kubenswrapper[4921]: I0318 13:45:00.989982 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9a36da-9efd-4e81-bc24-925a880271da-kube-api-access-nscdk" (OuterVolumeSpecName: "kube-api-access-nscdk") pod "6a9a36da-9efd-4e81-bc24-925a880271da" (UID: "6a9a36da-9efd-4e81-bc24-925a880271da"). InnerVolumeSpecName "kube-api-access-nscdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.078220 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx7m4\" (UniqueName: \"kubernetes.io/projected/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-kube-api-access-qx7m4\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.078259 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a9a36da-9efd-4e81-bc24-925a880271da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.078268 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.078278 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nscdk\" (UniqueName: \"kubernetes.io/projected/6a9a36da-9efd-4e81-bc24-925a880271da-kube-api-access-nscdk\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.112572 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:45:01 crc kubenswrapper[4921]: W0318 13:45:01.204603 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dfcce58_9dd7_456d_a101_6b1d5ba04d54.slice/crio-2785fd3039341c8cfa5a929e9675c816f0bc00660c5848e5796b573b6ded8133 WatchSource:0}: Error finding container 2785fd3039341c8cfa5a929e9675c816f0bc00660c5848e5796b573b6ded8133: Status 404 returned error can't find the container with id 2785fd3039341c8cfa5a929e9675c816f0bc00660c5848e5796b573b6ded8133 Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.207910 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.243101 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq"] Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.309200 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75fe9e8d-d964-497d-8956-6ab4b35e8c79-operator-scripts\") pod \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.309696 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhgfr\" (UniqueName: \"kubernetes.io/projected/75fe9e8d-d964-497d-8956-6ab4b35e8c79-kube-api-access-xhgfr\") pod \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\" (UID: \"75fe9e8d-d964-497d-8956-6ab4b35e8c79\") " Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.310386 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75fe9e8d-d964-497d-8956-6ab4b35e8c79-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "75fe9e8d-d964-497d-8956-6ab4b35e8c79" (UID: "75fe9e8d-d964-497d-8956-6ab4b35e8c79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.315008 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75fe9e8d-d964-497d-8956-6ab4b35e8c79-kube-api-access-xhgfr" (OuterVolumeSpecName: "kube-api-access-xhgfr") pod "75fe9e8d-d964-497d-8956-6ab4b35e8c79" (UID: "75fe9e8d-d964-497d-8956-6ab4b35e8c79"). InnerVolumeSpecName "kube-api-access-xhgfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.411289 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fe9ee-f061-44f7-9c27-3a8700a1be60-operator-scripts\") pod \"095fe9ee-f061-44f7-9c27-3a8700a1be60\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.411384 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq8c2\" (UniqueName: \"kubernetes.io/projected/095fe9ee-f061-44f7-9c27-3a8700a1be60-kube-api-access-tq8c2\") pod \"095fe9ee-f061-44f7-9c27-3a8700a1be60\" (UID: \"095fe9ee-f061-44f7-9c27-3a8700a1be60\") " Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.411728 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095fe9ee-f061-44f7-9c27-3a8700a1be60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "095fe9ee-f061-44f7-9c27-3a8700a1be60" (UID: "095fe9ee-f061-44f7-9c27-3a8700a1be60"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.412082 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhgfr\" (UniqueName: \"kubernetes.io/projected/75fe9e8d-d964-497d-8956-6ab4b35e8c79-kube-api-access-xhgfr\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.412120 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/095fe9ee-f061-44f7-9c27-3a8700a1be60-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.412136 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75fe9e8d-d964-497d-8956-6ab4b35e8c79-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.415977 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095fe9ee-f061-44f7-9c27-3a8700a1be60-kube-api-access-tq8c2" (OuterVolumeSpecName: "kube-api-access-tq8c2") pod "095fe9ee-f061-44f7-9c27-3a8700a1be60" (UID: "095fe9ee-f061-44f7-9c27-3a8700a1be60"). InnerVolumeSpecName "kube-api-access-tq8c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.504190 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-755e-account-create-update-bssdk" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.504225 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-755e-account-create-update-bssdk" event={"ID":"75fe9e8d-d964-497d-8956-6ab4b35e8c79","Type":"ContainerDied","Data":"093f9ec574b9ef32ac7a98ed73516d7ebba61f506986e23c1d8c45da47f5afdb"} Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.505845 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="093f9ec574b9ef32ac7a98ed73516d7ebba61f506986e23c1d8c45da47f5afdb" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.507083 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" event={"ID":"1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f","Type":"ContainerDied","Data":"e4203856b8d9188235264ba606b7e5a8c0c5c96bb1aa869a4dd2af96f672eb69"} Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.507161 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4203856b8d9188235264ba606b7e5a8c0c5c96bb1aa869a4dd2af96f672eb69" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.507129 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0efa-account-create-update-srvvf" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.511551 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" event={"ID":"1dfcce58-9dd7-456d-a101-6b1d5ba04d54","Type":"ContainerStarted","Data":"0acccaae3270202bc784c753fc39b93c3c148301e6c00744c4fcc3006dd7fc20"} Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.511602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" event={"ID":"1dfcce58-9dd7-456d-a101-6b1d5ba04d54","Type":"ContainerStarted","Data":"2785fd3039341c8cfa5a929e9675c816f0bc00660c5848e5796b573b6ded8133"} Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.512954 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq8c2\" (UniqueName: \"kubernetes.io/projected/095fe9ee-f061-44f7-9c27-3a8700a1be60-kube-api-access-tq8c2\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.514897 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9453-account-create-update-d76hx" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.515256 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9453-account-create-update-d76hx" event={"ID":"095fe9ee-f061-44f7-9c27-3a8700a1be60","Type":"ContainerDied","Data":"54d80db395188aa909c0a2f29addf360a47b0ba765fc9c996c7d29a1ae3b4354"} Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.515303 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d80db395188aa909c0a2f29addf360a47b0ba765fc9c996c7d29a1ae3b4354" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.518573 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dp7ds" event={"ID":"6a9a36da-9efd-4e81-bc24-925a880271da","Type":"ContainerDied","Data":"cd3e6222d0802885a8f92aec72c70960898edf2eef211d88a902a3a3608c4ac2"} Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.518600 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd3e6222d0802885a8f92aec72c70960898edf2eef211d88a902a3a3608c4ac2" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.518657 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dp7ds" Mar 18 13:45:01 crc kubenswrapper[4921]: I0318 13:45:01.530369 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" podStartSLOduration=1.53035061 podStartE2EDuration="1.53035061s" podCreationTimestamp="2026-03-18 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:01.526784248 +0000 UTC m=+5721.076704887" watchObservedRunningTime="2026-03-18 13:45:01.53035061 +0000 UTC m=+5721.080271249" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.529099 4921 generic.go:334] "Generic (PLEG): container finished" podID="1dfcce58-9dd7-456d-a101-6b1d5ba04d54" containerID="0acccaae3270202bc784c753fc39b93c3c148301e6c00744c4fcc3006dd7fc20" exitCode=0 Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.529161 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" event={"ID":"1dfcce58-9dd7-456d-a101-6b1d5ba04d54","Type":"ContainerDied","Data":"0acccaae3270202bc784c753fc39b93c3c148301e6c00744c4fcc3006dd7fc20"} Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.588574 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-smltf"] Mar 18 13:45:02 crc kubenswrapper[4921]: E0318 13:45:02.589250 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589267 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: E0318 13:45:02.589283 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a9a36da-9efd-4e81-bc24-925a880271da" containerName="mariadb-database-create" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589289 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a36da-9efd-4e81-bc24-925a880271da" containerName="mariadb-database-create" Mar 18 13:45:02 crc kubenswrapper[4921]: E0318 13:45:02.589300 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75fe9e8d-d964-497d-8956-6ab4b35e8c79" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589306 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="75fe9e8d-d964-497d-8956-6ab4b35e8c79" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: E0318 13:45:02.589318 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095fe9ee-f061-44f7-9c27-3a8700a1be60" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589323 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="095fe9ee-f061-44f7-9c27-3a8700a1be60" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589490 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589507 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="095fe9ee-f061-44f7-9c27-3a8700a1be60" containerName="mariadb-account-create-update" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589516 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9a36da-9efd-4e81-bc24-925a880271da" containerName="mariadb-database-create" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.589528 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="75fe9e8d-d964-497d-8956-6ab4b35e8c79" containerName="mariadb-account-create-update" 
Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.590189 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.592687 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rjh7l" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.592954 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.594191 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.613600 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-smltf"] Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.734975 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjr6m\" (UniqueName: \"kubernetes.io/projected/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-kube-api-access-kjr6m\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.735541 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.735627 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-config-data\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.735700 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-scripts\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.836986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-scripts\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.837313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjr6m\" (UniqueName: \"kubernetes.io/projected/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-kube-api-access-kjr6m\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.837412 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.837533 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-config-data\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.843621 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-scripts\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.847330 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-config-data\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.852304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.858845 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjr6m\" (UniqueName: \"kubernetes.io/projected/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-kube-api-access-kjr6m\") pod \"nova-cell0-conductor-db-sync-smltf\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:02 crc kubenswrapper[4921]: I0318 13:45:02.904666 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:03 crc kubenswrapper[4921]: I0318 13:45:03.331945 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-smltf"] Mar 18 13:45:03 crc kubenswrapper[4921]: W0318 13:45:03.338819 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0fe554c_3d22_45c1_8f9f_b10b3d36ad36.slice/crio-88d84d2638f4a91eaba9afcc31d4e0b8a39cadac16a337e0239a3252087ee61b WatchSource:0}: Error finding container 88d84d2638f4a91eaba9afcc31d4e0b8a39cadac16a337e0239a3252087ee61b: Status 404 returned error can't find the container with id 88d84d2638f4a91eaba9afcc31d4e0b8a39cadac16a337e0239a3252087ee61b Mar 18 13:45:03 crc kubenswrapper[4921]: I0318 13:45:03.557818 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-smltf" event={"ID":"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36","Type":"ContainerStarted","Data":"deb5d22713ef5c5c4763ce0b3d2b60a1b681ca478566596c6af362549c7a984b"} Mar 18 13:45:03 crc kubenswrapper[4921]: I0318 13:45:03.558148 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-smltf" event={"ID":"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36","Type":"ContainerStarted","Data":"88d84d2638f4a91eaba9afcc31d4e0b8a39cadac16a337e0239a3252087ee61b"} Mar 18 13:45:03 crc kubenswrapper[4921]: I0318 13:45:03.575706 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-smltf" podStartSLOduration=1.575683181 podStartE2EDuration="1.575683181s" podCreationTimestamp="2026-03-18 13:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:03.570460032 +0000 UTC m=+5723.120380671" watchObservedRunningTime="2026-03-18 13:45:03.575683181 +0000 UTC 
m=+5723.125603830" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.445931 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.564938 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.564932 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq" event={"ID":"1dfcce58-9dd7-456d-a101-6b1d5ba04d54","Type":"ContainerDied","Data":"2785fd3039341c8cfa5a929e9675c816f0bc00660c5848e5796b573b6ded8133"} Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.565342 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2785fd3039341c8cfa5a929e9675c816f0bc00660c5848e5796b573b6ded8133" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.628792 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p42zb\" (UniqueName: \"kubernetes.io/projected/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-kube-api-access-p42zb\") pod \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.628839 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-config-volume\") pod \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.629349 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-secret-volume\") pod 
\"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\" (UID: \"1dfcce58-9dd7-456d-a101-6b1d5ba04d54\") " Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.629535 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-config-volume" (OuterVolumeSpecName: "config-volume") pod "1dfcce58-9dd7-456d-a101-6b1d5ba04d54" (UID: "1dfcce58-9dd7-456d-a101-6b1d5ba04d54"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.629641 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.635121 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1dfcce58-9dd7-456d-a101-6b1d5ba04d54" (UID: "1dfcce58-9dd7-456d-a101-6b1d5ba04d54"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.641765 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-kube-api-access-p42zb" (OuterVolumeSpecName: "kube-api-access-p42zb") pod "1dfcce58-9dd7-456d-a101-6b1d5ba04d54" (UID: "1dfcce58-9dd7-456d-a101-6b1d5ba04d54"). InnerVolumeSpecName "kube-api-access-p42zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.730951 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:04 crc kubenswrapper[4921]: I0318 13:45:04.730997 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p42zb\" (UniqueName: \"kubernetes.io/projected/1dfcce58-9dd7-456d-a101-6b1d5ba04d54-kube-api-access-p42zb\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:05 crc kubenswrapper[4921]: I0318 13:45:05.517943 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf"] Mar 18 13:45:05 crc kubenswrapper[4921]: I0318 13:45:05.526018 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-wgngf"] Mar 18 13:45:06 crc kubenswrapper[4921]: I0318 13:45:06.209370 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:45:06 crc kubenswrapper[4921]: E0318 13:45:06.209592 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:45:07 crc kubenswrapper[4921]: I0318 13:45:07.221887 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e2501f-5413-44f3-beba-c05c5cd108ab" path="/var/lib/kubelet/pods/e7e2501f-5413-44f3-beba-c05c5cd108ab/volumes" Mar 18 13:45:08 crc kubenswrapper[4921]: I0318 13:45:08.550510 4921 scope.go:117] 
"RemoveContainer" containerID="2b763669d19e3e400b68fdce97ae2c185f2aee49c7f398fb45f1f875b0f3bd8a" Mar 18 13:45:08 crc kubenswrapper[4921]: I0318 13:45:08.573069 4921 scope.go:117] "RemoveContainer" containerID="056d83686f459b812f163ff110c22948adda33a51c92014cd85737a84fe4c4f1" Mar 18 13:45:08 crc kubenswrapper[4921]: I0318 13:45:08.606157 4921 scope.go:117] "RemoveContainer" containerID="32ca28282b9917bb375c16f381c375a9ba77e40bafe6ccce322a0bb6a6a86e75" Mar 18 13:45:09 crc kubenswrapper[4921]: I0318 13:45:09.609386 4921 generic.go:334] "Generic (PLEG): container finished" podID="b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" containerID="deb5d22713ef5c5c4763ce0b3d2b60a1b681ca478566596c6af362549c7a984b" exitCode=0 Mar 18 13:45:09 crc kubenswrapper[4921]: I0318 13:45:09.609498 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-smltf" event={"ID":"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36","Type":"ContainerDied","Data":"deb5d22713ef5c5c4763ce0b3d2b60a1b681ca478566596c6af362549c7a984b"} Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.002864 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.055848 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-config-data\") pod \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.055927 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-scripts\") pod \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.056046 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-combined-ca-bundle\") pod \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.056088 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjr6m\" (UniqueName: \"kubernetes.io/projected/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-kube-api-access-kjr6m\") pod \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\" (UID: \"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36\") " Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.065358 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-scripts" (OuterVolumeSpecName: "scripts") pod "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" (UID: "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.065476 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-kube-api-access-kjr6m" (OuterVolumeSpecName: "kube-api-access-kjr6m") pod "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" (UID: "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36"). InnerVolumeSpecName "kube-api-access-kjr6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.080555 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-config-data" (OuterVolumeSpecName: "config-data") pod "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" (UID: "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.102328 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" (UID: "b0fe554c-3d22-45c1-8f9f-b10b3d36ad36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.158411 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.158452 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjr6m\" (UniqueName: \"kubernetes.io/projected/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-kube-api-access-kjr6m\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.158468 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.158480 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.631832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-smltf" event={"ID":"b0fe554c-3d22-45c1-8f9f-b10b3d36ad36","Type":"ContainerDied","Data":"88d84d2638f4a91eaba9afcc31d4e0b8a39cadac16a337e0239a3252087ee61b"} Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.632260 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88d84d2638f4a91eaba9afcc31d4e0b8a39cadac16a337e0239a3252087ee61b" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.632000 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-smltf" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.699649 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:45:11 crc kubenswrapper[4921]: E0318 13:45:11.700092 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" containerName="nova-cell0-conductor-db-sync" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.700130 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" containerName="nova-cell0-conductor-db-sync" Mar 18 13:45:11 crc kubenswrapper[4921]: E0318 13:45:11.700147 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfcce58-9dd7-456d-a101-6b1d5ba04d54" containerName="collect-profiles" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.700154 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfcce58-9dd7-456d-a101-6b1d5ba04d54" containerName="collect-profiles" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.700360 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" containerName="nova-cell0-conductor-db-sync" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.700402 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfcce58-9dd7-456d-a101-6b1d5ba04d54" containerName="collect-profiles" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.701092 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.703651 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.704297 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rjh7l" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.712238 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.770772 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4g9t\" (UniqueName: \"kubernetes.io/projected/396d2df9-ddb7-4514-a6e2-991b6c410448-kube-api-access-p4g9t\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.770852 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.770927 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.872909 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4g9t\" (UniqueName: 
\"kubernetes.io/projected/396d2df9-ddb7-4514-a6e2-991b6c410448-kube-api-access-p4g9t\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.872982 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.873028 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.878826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.880470 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:11 crc kubenswrapper[4921]: I0318 13:45:11.890337 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4g9t\" (UniqueName: \"kubernetes.io/projected/396d2df9-ddb7-4514-a6e2-991b6c410448-kube-api-access-p4g9t\") pod \"nova-cell0-conductor-0\" (UID: 
\"396d2df9-ddb7-4514-a6e2-991b6c410448\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:12 crc kubenswrapper[4921]: I0318 13:45:12.018086 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:12 crc kubenswrapper[4921]: I0318 13:45:12.429120 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:45:12 crc kubenswrapper[4921]: I0318 13:45:12.644757 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"396d2df9-ddb7-4514-a6e2-991b6c410448","Type":"ContainerStarted","Data":"9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6"} Mar 18 13:45:12 crc kubenswrapper[4921]: I0318 13:45:12.645343 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"396d2df9-ddb7-4514-a6e2-991b6c410448","Type":"ContainerStarted","Data":"f9ed5bdd44066d7d635463570e006058304ce451ebed562ee6b5140afe4e3511"} Mar 18 13:45:12 crc kubenswrapper[4921]: I0318 13:45:12.645605 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:12 crc kubenswrapper[4921]: I0318 13:45:12.665508 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.665484921 podStartE2EDuration="1.665484921s" podCreationTimestamp="2026-03-18 13:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:12.658227054 +0000 UTC m=+5732.208147703" watchObservedRunningTime="2026-03-18 13:45:12.665484921 +0000 UTC m=+5732.215405560" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.044042 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 
13:45:17.503547 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xgv99"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.504625 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.507582 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.522380 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.524094 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xgv99"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.630438 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.631478 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.640390 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.644631 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.671510 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.671805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-scripts\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.671949 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-config-data\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.672028 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltw8\" (UniqueName: \"kubernetes.io/projected/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-kube-api-access-kltw8\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " 
pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.776546 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltw8\" (UniqueName: \"kubernetes.io/projected/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-kube-api-access-kltw8\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.777182 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqtw7\" (UniqueName: \"kubernetes.io/projected/df87e599-31a3-47be-b740-25701e5e5ca8-kube-api-access-cqtw7\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.777288 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-config-data\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.777415 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.777588 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-scripts\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 
18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.777712 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.777899 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-config-data\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.780985 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.785377 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.786988 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.787760 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-scripts\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.797337 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.801196 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.806465 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltw8\" (UniqueName: \"kubernetes.io/projected/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-kube-api-access-kltw8\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.806738 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-config-data\") pod \"nova-cell0-cell-mapping-xgv99\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.846507 4921 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.848561 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.849193 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.853535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.883445 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqtw7\" (UniqueName: \"kubernetes.io/projected/df87e599-31a3-47be-b740-25701e5e5ca8-kube-api-access-cqtw7\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.883505 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-config-data\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.883616 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.895556 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-config-data\") pod \"nova-scheduler-0\" (UID: 
\"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.896715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.902506 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.978310 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqtw7\" (UniqueName: \"kubernetes.io/projected/df87e599-31a3-47be-b740-25701e5e5ca8-kube-api-access-cqtw7\") pod \"nova-scheduler-0\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.985889 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.985955 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.985973 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-config-data\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.986035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.986075 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmxwq\" (UniqueName: \"kubernetes.io/projected/60ece332-11ab-4c10-93c1-587177ab25cc-kube-api-access-cmxwq\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.986096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60ece332-11ab-4c10-93c1-587177ab25cc-logs\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.986130 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrrzg\" (UniqueName: \"kubernetes.io/projected/0d7160e1-8c76-4e45-b7b9-68556e95db42-kube-api-access-jrrzg\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.986231 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.987742 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.989612 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:45:17 crc kubenswrapper[4921]: I0318 13:45:17.999643 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-nvscx"] Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.001740 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.050745 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.067270 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-nvscx"] Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.087606 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.087679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-config-data\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.087777 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 
13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.087799 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-config-data\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088600 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820c8c37-f45a-465e-bdfb-914e9dd5e209-logs\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088642 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55cn6\" (UniqueName: \"kubernetes.io/projected/820c8c37-f45a-465e-bdfb-914e9dd5e209-kube-api-access-55cn6\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088689 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0" Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088742 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmxwq\" 
(UniqueName: \"kubernetes.io/projected/60ece332-11ab-4c10-93c1-587177ab25cc-kube-api-access-cmxwq\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088764 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60ece332-11ab-4c10-93c1-587177ab25cc-logs\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.088782 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrrzg\" (UniqueName: \"kubernetes.io/projected/0d7160e1-8c76-4e45-b7b9-68556e95db42-kube-api-access-jrrzg\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.091217 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60ece332-11ab-4c10-93c1-587177ab25cc-logs\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.098975 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-config-data\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.099661 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.100190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.100861 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.114393 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrrzg\" (UniqueName: \"kubernetes.io/projected/0d7160e1-8c76-4e45-b7b9-68556e95db42-kube-api-access-jrrzg\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.124893 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmxwq\" (UniqueName: \"kubernetes.io/projected/60ece332-11ab-4c10-93c1-587177ab25cc-kube-api-access-cmxwq\") pod \"nova-metadata-0\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191418 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-config-data\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191498 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-dns-svc\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-config\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820c8c37-f45a-465e-bdfb-914e9dd5e209-logs\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191601 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55cn6\" (UniqueName: \"kubernetes.io/projected/820c8c37-f45a-465e-bdfb-914e9dd5e209-kube-api-access-55cn6\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191641 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-nb\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191727 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9nl\" (UniqueName: \"kubernetes.io/projected/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-kube-api-access-zf9nl\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.191759 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-sb\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.193425 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820c8c37-f45a-465e-bdfb-914e9dd5e209-logs\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.195959 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-config-data\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.201908 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.212283 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55cn6\" (UniqueName: \"kubernetes.io/projected/820c8c37-f45a-465e-bdfb-914e9dd5e209-kube-api-access-55cn6\") pod \"nova-api-0\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.256987 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.294015 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-config\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.294173 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-nb\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.294225 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9nl\" (UniqueName: \"kubernetes.io/projected/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-kube-api-access-zf9nl\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.294247 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-sb\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.294303 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-dns-svc\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.295571 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-dns-svc\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.296371 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-config\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.296975 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-nb\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.297961 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-sb\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.312813 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.319617 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9nl\" (UniqueName: \"kubernetes.io/projected/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-kube-api-access-zf9nl\") pod \"dnsmasq-dns-69dc7db885-nvscx\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.327346 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.357993 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.368443 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.567516 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xgv99"]
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.610308 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkfpg"]
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.612452 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.616439 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.616491 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.651085 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkfpg"]
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.702023 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xgv99" event={"ID":"83fb77e2-32ca-40d1-a597-47c8ebe2a41a","Type":"ContainerStarted","Data":"179e2b4d6294e1fd97d5c55e2070de1ae10b38ff1dd8581a647fdf03ddd2deb3"}
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.804455 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-config-data\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.804640 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxlr\" (UniqueName: \"kubernetes.io/projected/561064ed-df57-419d-8d92-3837da58564b-kube-api-access-fsxlr\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.804689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.804730 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-scripts\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.860374 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.905803 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsxlr\" (UniqueName: \"kubernetes.io/projected/561064ed-df57-419d-8d92-3837da58564b-kube-api-access-fsxlr\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.905859 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.905892 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-scripts\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.905955 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-config-data\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.909849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-scripts\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.910305 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-config-data\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.910496 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.924682 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxlr\" (UniqueName: \"kubernetes.io/projected/561064ed-df57-419d-8d92-3837da58564b-kube-api-access-fsxlr\") pod \"nova-cell1-conductor-db-sync-tkfpg\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") " pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:18 crc kubenswrapper[4921]: I0318 13:45:18.958412 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.013898 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.164955 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.181395 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.201187 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-nvscx"]
Mar 18 13:45:19 crc kubenswrapper[4921]: W0318 13:45:19.209559 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4db0a9e2_b13b_44be_bfb3_68c6590e81b4.slice/crio-1a7b488aa60f0ed77354f0403ca06c545fdac7a1b71ad4657e82932c4ad1ec2c WatchSource:0}: Error finding container 1a7b488aa60f0ed77354f0403ca06c545fdac7a1b71ad4657e82932c4ad1ec2c: Status 404 returned error can't find the container with id 1a7b488aa60f0ed77354f0403ca06c545fdac7a1b71ad4657e82932c4ad1ec2c
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.510303 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkfpg"]
Mar 18 13:45:19 crc kubenswrapper[4921]: W0318 13:45:19.551716 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod561064ed_df57_419d_8d92_3837da58564b.slice/crio-7597201254dad929eb9349416916a86da32829da6b82e5b0e37844f558943a90 WatchSource:0}: Error finding container 7597201254dad929eb9349416916a86da32829da6b82e5b0e37844f558943a90: Status 404 returned error can't find the container with id 7597201254dad929eb9349416916a86da32829da6b82e5b0e37844f558943a90
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.714506 4921 generic.go:334] "Generic (PLEG): container finished" podID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerID="76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7" exitCode=0
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.714555 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" event={"ID":"4db0a9e2-b13b-44be-bfb3-68c6590e81b4","Type":"ContainerDied","Data":"76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.714953 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" event={"ID":"4db0a9e2-b13b-44be-bfb3-68c6590e81b4","Type":"ContainerStarted","Data":"1a7b488aa60f0ed77354f0403ca06c545fdac7a1b71ad4657e82932c4ad1ec2c"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.724566 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"820c8c37-f45a-465e-bdfb-914e9dd5e209","Type":"ContainerStarted","Data":"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.724626 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"820c8c37-f45a-465e-bdfb-914e9dd5e209","Type":"ContainerStarted","Data":"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.724641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"820c8c37-f45a-465e-bdfb-914e9dd5e209","Type":"ContainerStarted","Data":"1f2585162533f465613cea70a254fea1c5e5a6e15af580a351af80b6eb4a6d09"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.729164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"df87e599-31a3-47be-b740-25701e5e5ca8","Type":"ContainerStarted","Data":"a786fd4d469c04a8d6fb60df4c5d28075b32545d415f56854c6f984b0a09fe53"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.729210 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"df87e599-31a3-47be-b740-25701e5e5ca8","Type":"ContainerStarted","Data":"16b76e5869d1129b1311086a06a16fbd4e71c42898b69868c583d0895c33066e"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.731294 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d7160e1-8c76-4e45-b7b9-68556e95db42","Type":"ContainerStarted","Data":"8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.731327 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d7160e1-8c76-4e45-b7b9-68556e95db42","Type":"ContainerStarted","Data":"778a49e7bd88d4e0c715025b9e9ef536988b3ccf37cc166299492bb2a6c42011"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.737142 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xgv99" event={"ID":"83fb77e2-32ca-40d1-a597-47c8ebe2a41a","Type":"ContainerStarted","Data":"e97220eb3c36c88dd7c95e7b0b6347c9f9d43ed9936de0f523fea1693b0a7006"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.752662 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkfpg" event={"ID":"561064ed-df57-419d-8d92-3837da58564b","Type":"ContainerStarted","Data":"7597201254dad929eb9349416916a86da32829da6b82e5b0e37844f558943a90"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.761095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60ece332-11ab-4c10-93c1-587177ab25cc","Type":"ContainerStarted","Data":"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.761197 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60ece332-11ab-4c10-93c1-587177ab25cc","Type":"ContainerStarted","Data":"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.761211 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60ece332-11ab-4c10-93c1-587177ab25cc","Type":"ContainerStarted","Data":"f0bdd3b8c82c74bbfa882094d4aa470e5531acbf50131e692c45fc245e7342d9"}
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.776983 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.776952332 podStartE2EDuration="2.776952332s" podCreationTimestamp="2026-03-18 13:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:19.766976657 +0000 UTC m=+5739.316897306" watchObservedRunningTime="2026-03-18 13:45:19.776952332 +0000 UTC m=+5739.326872971"
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.786785 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xgv99" podStartSLOduration=2.786769572 podStartE2EDuration="2.786769572s" podCreationTimestamp="2026-03-18 13:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:19.785220968 +0000 UTC m=+5739.335141617" watchObservedRunningTime="2026-03-18 13:45:19.786769572 +0000 UTC m=+5739.336690211"
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.818302 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.818275232 podStartE2EDuration="2.818275232s" podCreationTimestamp="2026-03-18 13:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:19.80033696 +0000 UTC m=+5739.350257599" watchObservedRunningTime="2026-03-18 13:45:19.818275232 +0000 UTC m=+5739.368195871"
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.847068 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.847047813 podStartE2EDuration="2.847047813s" podCreationTimestamp="2026-03-18 13:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:19.846399075 +0000 UTC m=+5739.396319714" watchObservedRunningTime="2026-03-18 13:45:19.847047813 +0000 UTC m=+5739.396968452"
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.875799 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.875774073 podStartE2EDuration="2.875774073s" podCreationTimestamp="2026-03-18 13:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:19.870993487 +0000 UTC m=+5739.420914126" watchObservedRunningTime="2026-03-18 13:45:19.875774073 +0000 UTC m=+5739.425694712"
Mar 18 13:45:19 crc kubenswrapper[4921]: I0318 13:45:19.895230 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tkfpg" podStartSLOduration=1.895204418 podStartE2EDuration="1.895204418s" podCreationTimestamp="2026-03-18 13:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:19.893303024 +0000 UTC m=+5739.443223653" watchObservedRunningTime="2026-03-18 13:45:19.895204418 +0000 UTC m=+5739.445125057"
Mar 18 13:45:20 crc kubenswrapper[4921]: I0318 13:45:20.772022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkfpg" event={"ID":"561064ed-df57-419d-8d92-3837da58564b","Type":"ContainerStarted","Data":"bc93348d1519329bb49846c74df5daf0078783ff1a6ba57f4fd453e053c4e7b0"}
Mar 18 13:45:20 crc kubenswrapper[4921]: I0318 13:45:20.775332 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" event={"ID":"4db0a9e2-b13b-44be-bfb3-68c6590e81b4","Type":"ContainerStarted","Data":"c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58"}
Mar 18 13:45:20 crc kubenswrapper[4921]: I0318 13:45:20.807380 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" podStartSLOduration=3.807358848 podStartE2EDuration="3.807358848s" podCreationTimestamp="2026-03-18 13:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:20.798071673 +0000 UTC m=+5740.347992322" watchObservedRunningTime="2026-03-18 13:45:20.807358848 +0000 UTC m=+5740.357279497"
Mar 18 13:45:21 crc kubenswrapper[4921]: I0318 13:45:21.222050 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d"
Mar 18 13:45:21 crc kubenswrapper[4921]: E0318 13:45:21.222324 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:45:21 crc kubenswrapper[4921]: I0318 13:45:21.783157 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:23 crc kubenswrapper[4921]: I0318 13:45:23.257976 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 13:45:23 crc kubenswrapper[4921]: I0318 13:45:23.327531 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:23 crc kubenswrapper[4921]: I0318 13:45:23.805536 4921 generic.go:334] "Generic (PLEG): container finished" podID="561064ed-df57-419d-8d92-3837da58564b" containerID="bc93348d1519329bb49846c74df5daf0078783ff1a6ba57f4fd453e053c4e7b0" exitCode=0
Mar 18 13:45:23 crc kubenswrapper[4921]: I0318 13:45:23.805595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkfpg" event={"ID":"561064ed-df57-419d-8d92-3837da58564b","Type":"ContainerDied","Data":"bc93348d1519329bb49846c74df5daf0078783ff1a6ba57f4fd453e053c4e7b0"}
Mar 18 13:45:24 crc kubenswrapper[4921]: I0318 13:45:24.815578 4921 generic.go:334] "Generic (PLEG): container finished" podID="83fb77e2-32ca-40d1-a597-47c8ebe2a41a" containerID="e97220eb3c36c88dd7c95e7b0b6347c9f9d43ed9936de0f523fea1693b0a7006" exitCode=0
Mar 18 13:45:24 crc kubenswrapper[4921]: I0318 13:45:24.815634 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xgv99" event={"ID":"83fb77e2-32ca-40d1-a597-47c8ebe2a41a","Type":"ContainerDied","Data":"e97220eb3c36c88dd7c95e7b0b6347c9f9d43ed9936de0f523fea1693b0a7006"}
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.247346 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkfpg"
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.447813 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-config-data\") pod \"561064ed-df57-419d-8d92-3837da58564b\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") "
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.448155 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsxlr\" (UniqueName: \"kubernetes.io/projected/561064ed-df57-419d-8d92-3837da58564b-kube-api-access-fsxlr\") pod \"561064ed-df57-419d-8d92-3837da58564b\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") "
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.448181 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-combined-ca-bundle\") pod \"561064ed-df57-419d-8d92-3837da58564b\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") "
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.448239 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-scripts\") pod \"561064ed-df57-419d-8d92-3837da58564b\" (UID: \"561064ed-df57-419d-8d92-3837da58564b\") "
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.452688 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-scripts" (OuterVolumeSpecName: "scripts") pod "561064ed-df57-419d-8d92-3837da58564b" (UID: "561064ed-df57-419d-8d92-3837da58564b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.452865 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561064ed-df57-419d-8d92-3837da58564b-kube-api-access-fsxlr" (OuterVolumeSpecName: "kube-api-access-fsxlr") pod "561064ed-df57-419d-8d92-3837da58564b" (UID: "561064ed-df57-419d-8d92-3837da58564b"). InnerVolumeSpecName "kube-api-access-fsxlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.472631 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "561064ed-df57-419d-8d92-3837da58564b" (UID: "561064ed-df57-419d-8d92-3837da58564b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.475228 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-config-data" (OuterVolumeSpecName: "config-data") pod "561064ed-df57-419d-8d92-3837da58564b" (UID: "561064ed-df57-419d-8d92-3837da58564b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.550210 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.550258 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsxlr\" (UniqueName: \"kubernetes.io/projected/561064ed-df57-419d-8d92-3837da58564b-kube-api-access-fsxlr\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.550273 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.550284 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561064ed-df57-419d-8d92-3837da58564b-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.827630 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkfpg" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.827621 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkfpg" event={"ID":"561064ed-df57-419d-8d92-3837da58564b","Type":"ContainerDied","Data":"7597201254dad929eb9349416916a86da32829da6b82e5b0e37844f558943a90"} Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.827781 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7597201254dad929eb9349416916a86da32829da6b82e5b0e37844f558943a90" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.931903 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:45:25 crc kubenswrapper[4921]: E0318 13:45:25.932480 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561064ed-df57-419d-8d92-3837da58564b" containerName="nova-cell1-conductor-db-sync" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.932499 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="561064ed-df57-419d-8d92-3837da58564b" containerName="nova-cell1-conductor-db-sync" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.932739 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="561064ed-df57-419d-8d92-3837da58564b" containerName="nova-cell1-conductor-db-sync" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.935601 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.939522 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.947241 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.957465 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.957606 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:25 crc kubenswrapper[4921]: I0318 13:45:25.957635 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnk76\" (UniqueName: \"kubernetes.io/projected/f55371a6-079f-4c8b-8460-b330cdc72ff6-kube-api-access-qnk76\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.064302 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc 
kubenswrapper[4921]: I0318 13:45:26.064352 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnk76\" (UniqueName: \"kubernetes.io/projected/f55371a6-079f-4c8b-8460-b330cdc72ff6-kube-api-access-qnk76\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.064481 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.069753 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.072072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.080435 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnk76\" (UniqueName: \"kubernetes.io/projected/f55371a6-079f-4c8b-8460-b330cdc72ff6-kube-api-access-qnk76\") pod \"nova-cell1-conductor-0\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.152921 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.168202 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-combined-ca-bundle\") pod \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.168281 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-config-data\") pod \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.168867 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kltw8\" (UniqueName: \"kubernetes.io/projected/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-kube-api-access-kltw8\") pod \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.168902 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-scripts\") pod \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\" (UID: \"83fb77e2-32ca-40d1-a597-47c8ebe2a41a\") " Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.172395 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-scripts" (OuterVolumeSpecName: "scripts") pod "83fb77e2-32ca-40d1-a597-47c8ebe2a41a" (UID: "83fb77e2-32ca-40d1-a597-47c8ebe2a41a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.172527 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-kube-api-access-kltw8" (OuterVolumeSpecName: "kube-api-access-kltw8") pod "83fb77e2-32ca-40d1-a597-47c8ebe2a41a" (UID: "83fb77e2-32ca-40d1-a597-47c8ebe2a41a"). InnerVolumeSpecName "kube-api-access-kltw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.175336 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kltw8\" (UniqueName: \"kubernetes.io/projected/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-kube-api-access-kltw8\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.175365 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.195260 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83fb77e2-32ca-40d1-a597-47c8ebe2a41a" (UID: "83fb77e2-32ca-40d1-a597-47c8ebe2a41a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.211234 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-config-data" (OuterVolumeSpecName: "config-data") pod "83fb77e2-32ca-40d1-a597-47c8ebe2a41a" (UID: "83fb77e2-32ca-40d1-a597-47c8ebe2a41a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.253945 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.277007 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.277048 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fb77e2-32ca-40d1-a597-47c8ebe2a41a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.682727 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.836552 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xgv99" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.836589 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xgv99" event={"ID":"83fb77e2-32ca-40d1-a597-47c8ebe2a41a","Type":"ContainerDied","Data":"179e2b4d6294e1fd97d5c55e2070de1ae10b38ff1dd8581a647fdf03ddd2deb3"} Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.836926 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="179e2b4d6294e1fd97d5c55e2070de1ae10b38ff1dd8581a647fdf03ddd2deb3" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.839386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f55371a6-079f-4c8b-8460-b330cdc72ff6","Type":"ContainerStarted","Data":"bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b"} Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.839414 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f55371a6-079f-4c8b-8460-b330cdc72ff6","Type":"ContainerStarted","Data":"be016592d57e77d23cc34c8625a049106cce3325f9edce4c2261a7c297b925ba"} Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.839563 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:26 crc kubenswrapper[4921]: I0318 13:45:26.871638 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.871614594 podStartE2EDuration="1.871614594s" podCreationTimestamp="2026-03-18 13:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:26.854762242 +0000 UTC m=+5746.404682871" watchObservedRunningTime="2026-03-18 13:45:26.871614594 +0000 UTC m=+5746.421535233" Mar 18 13:45:27 crc kubenswrapper[4921]: 
I0318 13:45:27.044703 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.045041 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="df87e599-31a3-47be-b740-25701e5e5ca8" containerName="nova-scheduler-scheduler" containerID="cri-o://a786fd4d469c04a8d6fb60df4c5d28075b32545d415f56854c6f984b0a09fe53" gracePeriod=30 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.056903 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.057146 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-log" containerID="cri-o://65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7" gracePeriod=30 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.057278 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-api" containerID="cri-o://da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758" gracePeriod=30 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.099375 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.099626 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-log" containerID="cri-o://7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10" gracePeriod=30 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.099752 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-metadata" containerID="cri-o://cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce" gracePeriod=30 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.661329 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.701604 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55cn6\" (UniqueName: \"kubernetes.io/projected/820c8c37-f45a-465e-bdfb-914e9dd5e209-kube-api-access-55cn6\") pod \"820c8c37-f45a-465e-bdfb-914e9dd5e209\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.701676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-combined-ca-bundle\") pod \"820c8c37-f45a-465e-bdfb-914e9dd5e209\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.701701 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-config-data\") pod \"820c8c37-f45a-465e-bdfb-914e9dd5e209\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.701827 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820c8c37-f45a-465e-bdfb-914e9dd5e209-logs\") pod \"820c8c37-f45a-465e-bdfb-914e9dd5e209\" (UID: \"820c8c37-f45a-465e-bdfb-914e9dd5e209\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.702794 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820c8c37-f45a-465e-bdfb-914e9dd5e209-logs" (OuterVolumeSpecName: 
"logs") pod "820c8c37-f45a-465e-bdfb-914e9dd5e209" (UID: "820c8c37-f45a-465e-bdfb-914e9dd5e209"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.708282 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820c8c37-f45a-465e-bdfb-914e9dd5e209-kube-api-access-55cn6" (OuterVolumeSpecName: "kube-api-access-55cn6") pod "820c8c37-f45a-465e-bdfb-914e9dd5e209" (UID: "820c8c37-f45a-465e-bdfb-914e9dd5e209"). InnerVolumeSpecName "kube-api-access-55cn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.728906 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.740714 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "820c8c37-f45a-465e-bdfb-914e9dd5e209" (UID: "820c8c37-f45a-465e-bdfb-914e9dd5e209"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.765551 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-config-data" (OuterVolumeSpecName: "config-data") pod "820c8c37-f45a-465e-bdfb-914e9dd5e209" (UID: "820c8c37-f45a-465e-bdfb-914e9dd5e209"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.803363 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-combined-ca-bundle\") pod \"60ece332-11ab-4c10-93c1-587177ab25cc\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.803683 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60ece332-11ab-4c10-93c1-587177ab25cc-logs\") pod \"60ece332-11ab-4c10-93c1-587177ab25cc\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.803813 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmxwq\" (UniqueName: \"kubernetes.io/projected/60ece332-11ab-4c10-93c1-587177ab25cc-kube-api-access-cmxwq\") pod \"60ece332-11ab-4c10-93c1-587177ab25cc\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.803945 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-config-data\") pod \"60ece332-11ab-4c10-93c1-587177ab25cc\" (UID: \"60ece332-11ab-4c10-93c1-587177ab25cc\") " Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.804163 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60ece332-11ab-4c10-93c1-587177ab25cc-logs" (OuterVolumeSpecName: "logs") pod "60ece332-11ab-4c10-93c1-587177ab25cc" (UID: "60ece332-11ab-4c10-93c1-587177ab25cc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.804529 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60ece332-11ab-4c10-93c1-587177ab25cc-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.804612 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820c8c37-f45a-465e-bdfb-914e9dd5e209-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.804689 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55cn6\" (UniqueName: \"kubernetes.io/projected/820c8c37-f45a-465e-bdfb-914e9dd5e209-kube-api-access-55cn6\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.804765 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.804847 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820c8c37-f45a-465e-bdfb-914e9dd5e209-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.807178 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ece332-11ab-4c10-93c1-587177ab25cc-kube-api-access-cmxwq" (OuterVolumeSpecName: "kube-api-access-cmxwq") pod "60ece332-11ab-4c10-93c1-587177ab25cc" (UID: "60ece332-11ab-4c10-93c1-587177ab25cc"). InnerVolumeSpecName "kube-api-access-cmxwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.833606 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60ece332-11ab-4c10-93c1-587177ab25cc" (UID: "60ece332-11ab-4c10-93c1-587177ab25cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.833703 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-config-data" (OuterVolumeSpecName: "config-data") pod "60ece332-11ab-4c10-93c1-587177ab25cc" (UID: "60ece332-11ab-4c10-93c1-587177ab25cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.856675 4921 generic.go:334] "Generic (PLEG): container finished" podID="60ece332-11ab-4c10-93c1-587177ab25cc" containerID="cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce" exitCode=0 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.856741 4921 generic.go:334] "Generic (PLEG): container finished" podID="60ece332-11ab-4c10-93c1-587177ab25cc" containerID="7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10" exitCode=143 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.856927 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.857464 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60ece332-11ab-4c10-93c1-587177ab25cc","Type":"ContainerDied","Data":"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"} Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.857538 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60ece332-11ab-4c10-93c1-587177ab25cc","Type":"ContainerDied","Data":"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"} Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.857553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60ece332-11ab-4c10-93c1-587177ab25cc","Type":"ContainerDied","Data":"f0bdd3b8c82c74bbfa882094d4aa470e5531acbf50131e692c45fc245e7342d9"} Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.857596 4921 scope.go:117] "RemoveContainer" containerID="cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.878625 4921 generic.go:334] "Generic (PLEG): container finished" podID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerID="da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758" exitCode=0 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.878654 4921 generic.go:334] "Generic (PLEG): container finished" podID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerID="65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7" exitCode=143 Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.878697 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.878741 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"820c8c37-f45a-465e-bdfb-914e9dd5e209","Type":"ContainerDied","Data":"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"} Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.878769 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"820c8c37-f45a-465e-bdfb-914e9dd5e209","Type":"ContainerDied","Data":"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"} Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.878780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"820c8c37-f45a-465e-bdfb-914e9dd5e209","Type":"ContainerDied","Data":"1f2585162533f465613cea70a254fea1c5e5a6e15af580a351af80b6eb4a6d09"} Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.892454 4921 scope.go:117] "RemoveContainer" containerID="7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.900771 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.906411 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.906458 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmxwq\" (UniqueName: \"kubernetes.io/projected/60ece332-11ab-4c10-93c1-587177ab25cc-kube-api-access-cmxwq\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.906469 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/60ece332-11ab-4c10-93c1-587177ab25cc-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.918028 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.933694 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.934102 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-metadata"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934172 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-metadata"
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.934193 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fb77e2-32ca-40d1-a597-47c8ebe2a41a" containerName="nova-manage"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fb77e2-32ca-40d1-a597-47c8ebe2a41a" containerName="nova-manage"
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.934214 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-log"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934222 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-log"
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.934248 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-api"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934256 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-api"
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.934271 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-log"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934279 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-log"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934462 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-api"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934482 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" containerName="nova-api-log"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934494 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-metadata"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934503 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fb77e2-32ca-40d1-a597-47c8ebe2a41a" containerName="nova-manage"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.934512 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" containerName="nova-metadata-log"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.935493 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.937904 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.946802 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.947768 4921 scope.go:117] "RemoveContainer" containerID="cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.951314 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce\": container with ID starting with cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce not found: ID does not exist" containerID="cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.951354 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"} err="failed to get container status \"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce\": rpc error: code = NotFound desc = could not find container \"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce\": container with ID starting with cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce not found: ID does not exist"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.951379 4921 scope.go:117] "RemoveContainer" containerID="7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"
Mar 18 13:45:27 crc kubenswrapper[4921]: E0318 13:45:27.953088 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10\": container with ID starting with 7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10 not found: ID does not exist" containerID="7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.953162 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"} err="failed to get container status \"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10\": rpc error: code = NotFound desc = could not find container \"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10\": container with ID starting with 7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10 not found: ID does not exist"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.953194 4921 scope.go:117] "RemoveContainer" containerID="cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.953867 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce"} err="failed to get container status \"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce\": rpc error: code = NotFound desc = could not find container \"cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce\": container with ID starting with cf6c5264564776ced8431c98b6490aa6937de89477d66266ce0a68f907148bce not found: ID does not exist"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.953909 4921 scope.go:117] "RemoveContainer" containerID="7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.954242 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10"} err="failed to get container status \"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10\": rpc error: code = NotFound desc = could not find container \"7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10\": container with ID starting with 7baeb6c9bdadac023a9d7676a0aac86ffc67e55dc1546dea7194b777234f1a10 not found: ID does not exist"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.954264 4921 scope.go:117] "RemoveContainer" containerID="da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.964333 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.996548 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:45:27 crc kubenswrapper[4921]: I0318 13:45:27.999286 4921 scope.go:117] "RemoveContainer" containerID="65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.007143 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.008251 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-config-data\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.008299 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95766957-d3ab-453a-a198-6a199b9ddd71-logs\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.008358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.008441 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fj7n\" (UniqueName: \"kubernetes.io/projected/95766957-d3ab-453a-a198-6a199b9ddd71-kube-api-access-5fj7n\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.011555 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.013483 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.044266 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.058205 4921 scope.go:117] "RemoveContainer" containerID="da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"
Mar 18 13:45:28 crc kubenswrapper[4921]: E0318 13:45:28.059312 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758\": container with ID starting with da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758 not found: ID does not exist" containerID="da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.059358 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"} err="failed to get container status \"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758\": rpc error: code = NotFound desc = could not find container \"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758\": container with ID starting with da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758 not found: ID does not exist"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.059385 4921 scope.go:117] "RemoveContainer" containerID="65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"
Mar 18 13:45:28 crc kubenswrapper[4921]: E0318 13:45:28.060005 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7\": container with ID starting with 65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7 not found: ID does not exist" containerID="65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.060037 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"} err="failed to get container status \"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7\": rpc error: code = NotFound desc = could not find container \"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7\": container with ID starting with 65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7 not found: ID does not exist"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.060055 4921 scope.go:117] "RemoveContainer" containerID="da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.060470 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758"} err="failed to get container status \"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758\": rpc error: code = NotFound desc = could not find container \"da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758\": container with ID starting with da67f5fedfbe5a90b014021bbfb5ee2eb802752c10f6d9d2b9f56d9e1cc7f758 not found: ID does not exist"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.060493 4921 scope.go:117] "RemoveContainer" containerID="65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.060758 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7"} err="failed to get container status \"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7\": rpc error: code = NotFound desc = could not find container \"65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7\": container with ID starting with 65b27590039cad3a4493930476457a9afe874152316704a5e1db6a0093c01fa7 not found: ID does not exist"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.110884 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskb5\" (UniqueName: \"kubernetes.io/projected/4efa1435-f42b-493d-8cae-4b2c25bec8f0-kube-api-access-xskb5\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111208 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-config-data\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111243 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111269 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efa1435-f42b-493d-8cae-4b2c25bec8f0-logs\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fj7n\" (UniqueName: \"kubernetes.io/projected/95766957-d3ab-453a-a198-6a199b9ddd71-kube-api-access-5fj7n\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111414 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-config-data\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.111461 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95766957-d3ab-453a-a198-6a199b9ddd71-logs\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.112318 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95766957-d3ab-453a-a198-6a199b9ddd71-logs\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.118062 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.118226 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-config-data\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.131674 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fj7n\" (UniqueName: \"kubernetes.io/projected/95766957-d3ab-453a-a198-6a199b9ddd71-kube-api-access-5fj7n\") pod \"nova-metadata-0\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.214688 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xskb5\" (UniqueName: \"kubernetes.io/projected/4efa1435-f42b-493d-8cae-4b2c25bec8f0-kube-api-access-xskb5\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.215278 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-config-data\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.215416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efa1435-f42b-493d-8cae-4b2c25bec8f0-logs\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.215534 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.222914 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efa1435-f42b-493d-8cae-4b2c25bec8f0-logs\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.231482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.232268 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-config-data\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.236468 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xskb5\" (UniqueName: \"kubernetes.io/projected/4efa1435-f42b-493d-8cae-4b2c25bec8f0-kube-api-access-xskb5\") pod \"nova-api-0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.274925 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.328288 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.340697 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.343047 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.371308 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69dc7db885-nvscx"
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.425557 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-8f6ph"]
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.425828 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" containerName="dnsmasq-dns" containerID="cri-o://630f0fb1f6c1b5baad5beba796c852444f5afb0d92af123423726c1973ba554d" gracePeriod=10
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.833627 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.909633 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95766957-d3ab-453a-a198-6a199b9ddd71","Type":"ContainerStarted","Data":"90d981df24948c643ac3b86cc697fff998c73bb6053c14124526d15383b10bac"}
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.928768 4921 generic.go:334] "Generic (PLEG): container finished" podID="26284230-21f3-4580-ac05-967a2b6c63ac" containerID="630f0fb1f6c1b5baad5beba796c852444f5afb0d92af123423726c1973ba554d" exitCode=0
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.928844 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" event={"ID":"26284230-21f3-4580-ac05-967a2b6c63ac","Type":"ContainerDied","Data":"630f0fb1f6c1b5baad5beba796c852444f5afb0d92af123423726c1973ba554d"}
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.944836 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.946434 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 13:45:28 crc kubenswrapper[4921]: W0318 13:45:28.973758 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4efa1435_f42b_493d_8cae_4b2c25bec8f0.slice/crio-efd1b920882ef073935e3588335cb943cc03d74f076c3f975336e5a50fc5fd77 WatchSource:0}: Error finding container efd1b920882ef073935e3588335cb943cc03d74f076c3f975336e5a50fc5fd77: Status 404 returned error can't find the container with id efd1b920882ef073935e3588335cb943cc03d74f076c3f975336e5a50fc5fd77
Mar 18 13:45:28 crc kubenswrapper[4921]: I0318 13:45:28.991240 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.037414 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-dns-svc\") pod \"26284230-21f3-4580-ac05-967a2b6c63ac\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") "
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.037541 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tmnk\" (UniqueName: \"kubernetes.io/projected/26284230-21f3-4580-ac05-967a2b6c63ac-kube-api-access-4tmnk\") pod \"26284230-21f3-4580-ac05-967a2b6c63ac\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") "
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.037652 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-nb\") pod \"26284230-21f3-4580-ac05-967a2b6c63ac\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") "
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.037729 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-config\") pod \"26284230-21f3-4580-ac05-967a2b6c63ac\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") "
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.037803 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-sb\") pod \"26284230-21f3-4580-ac05-967a2b6c63ac\" (UID: \"26284230-21f3-4580-ac05-967a2b6c63ac\") "
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.058672 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26284230-21f3-4580-ac05-967a2b6c63ac-kube-api-access-4tmnk" (OuterVolumeSpecName: "kube-api-access-4tmnk") pod "26284230-21f3-4580-ac05-967a2b6c63ac" (UID: "26284230-21f3-4580-ac05-967a2b6c63ac"). InnerVolumeSpecName "kube-api-access-4tmnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.113106 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "26284230-21f3-4580-ac05-967a2b6c63ac" (UID: "26284230-21f3-4580-ac05-967a2b6c63ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.114388 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-config" (OuterVolumeSpecName: "config") pod "26284230-21f3-4580-ac05-967a2b6c63ac" (UID: "26284230-21f3-4580-ac05-967a2b6c63ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.120504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26284230-21f3-4580-ac05-967a2b6c63ac" (UID: "26284230-21f3-4580-ac05-967a2b6c63ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.139638 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "26284230-21f3-4580-ac05-967a2b6c63ac" (UID: "26284230-21f3-4580-ac05-967a2b6c63ac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.140408 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.140447 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.140460 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tmnk\" (UniqueName: \"kubernetes.io/projected/26284230-21f3-4580-ac05-967a2b6c63ac-kube-api-access-4tmnk\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.140470 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.140479 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26284230-21f3-4580-ac05-967a2b6c63ac-config\") on node \"crc\" DevicePath \"\""
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.238439 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ece332-11ab-4c10-93c1-587177ab25cc" path="/var/lib/kubelet/pods/60ece332-11ab-4c10-93c1-587177ab25cc/volumes"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.239175 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820c8c37-f45a-465e-bdfb-914e9dd5e209" path="/var/lib/kubelet/pods/820c8c37-f45a-465e-bdfb-914e9dd5e209/volumes"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.628617 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pdm2f"]
Mar 18 13:45:29 crc kubenswrapper[4921]: E0318 13:45:29.629425 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" containerName="dnsmasq-dns"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.629446 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" containerName="dnsmasq-dns"
Mar 18 13:45:29 crc kubenswrapper[4921]: E0318 13:45:29.629476 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" containerName="init"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.629486 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" containerName="init"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.629690 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" containerName="dnsmasq-dns"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.630985 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.637408 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pdm2f"]
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.747875 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-utilities\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.748071 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpz6x\" (UniqueName: \"kubernetes.io/projected/c0ae89fb-75a7-4610-ae64-8304470d74eb-kube-api-access-tpz6x\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.748328 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-catalog-content\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.850615 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-utilities\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.850712 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpz6x\" (UniqueName: \"kubernetes.io/projected/c0ae89fb-75a7-4610-ae64-8304470d74eb-kube-api-access-tpz6x\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.850781 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-catalog-content\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.851246 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-utilities\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.851324 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-catalog-content\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.871420 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpz6x\" (UniqueName: \"kubernetes.io/projected/c0ae89fb-75a7-4610-ae64-8304470d74eb-kube-api-access-tpz6x\") pod \"community-operators-pdm2f\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " pod="openshift-marketplace/community-operators-pdm2f"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.940256 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4efa1435-f42b-493d-8cae-4b2c25bec8f0","Type":"ContainerStarted","Data":"0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1"}
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.940315 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4efa1435-f42b-493d-8cae-4b2c25bec8f0","Type":"ContainerStarted","Data":"a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5"}
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.940331 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4efa1435-f42b-493d-8cae-4b2c25bec8f0","Type":"ContainerStarted","Data":"efd1b920882ef073935e3588335cb943cc03d74f076c3f975336e5a50fc5fd77"}
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.941996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95766957-d3ab-453a-a198-6a199b9ddd71","Type":"ContainerStarted","Data":"e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528"}
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.942032 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95766957-d3ab-453a-a198-6a199b9ddd71","Type":"ContainerStarted","Data":"4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203"}
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.943705 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph" event={"ID":"26284230-21f3-4580-ac05-967a2b6c63ac","Type":"ContainerDied","Data":"1197debbeaafe07252c2dd276443483ffd3afa31254498e49dad820e025fe3c3"}
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.943782 4921 scope.go:117] "RemoveContainer" containerID="630f0fb1f6c1b5baad5beba796c852444f5afb0d92af123423726c1973ba554d"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.943726 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c649d7bf-8f6ph"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.962824 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.96280754 podStartE2EDuration="2.96280754s" podCreationTimestamp="2026-03-18 13:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:29.9592865 +0000 UTC m=+5749.509207139" watchObservedRunningTime="2026-03-18 13:45:29.96280754 +0000 UTC m=+5749.512728179"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.965737 4921 scope.go:117] "RemoveContainer" containerID="60de801e0764e55095358917189e34a945936ba38e159b606379d4b4625325df"
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.983157 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-8f6ph"]
Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.996900 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:29 crc kubenswrapper[4921]: I0318 13:45:29.997907 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85c649d7bf-8f6ph"] Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.007410 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.007385793 podStartE2EDuration="3.007385793s" podCreationTimestamp="2026-03-18 13:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:29.990855661 +0000 UTC m=+5749.540776300" watchObservedRunningTime="2026-03-18 13:45:30.007385793 +0000 UTC m=+5749.557306432" Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.564748 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pdm2f"] Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.954157 4921 generic.go:334] "Generic (PLEG): container finished" podID="df87e599-31a3-47be-b740-25701e5e5ca8" containerID="a786fd4d469c04a8d6fb60df4c5d28075b32545d415f56854c6f984b0a09fe53" exitCode=0 Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.954213 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"df87e599-31a3-47be-b740-25701e5e5ca8","Type":"ContainerDied","Data":"a786fd4d469c04a8d6fb60df4c5d28075b32545d415f56854c6f984b0a09fe53"} Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.956223 4921 generic.go:334] "Generic (PLEG): container finished" podID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerID="29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731" exitCode=0 Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.956296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdm2f" 
event={"ID":"c0ae89fb-75a7-4610-ae64-8304470d74eb","Type":"ContainerDied","Data":"29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731"} Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.956360 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdm2f" event={"ID":"c0ae89fb-75a7-4610-ae64-8304470d74eb","Type":"ContainerStarted","Data":"1f02f6b32afa2ed1bb624ced22d35fa5fb8985e578fe0c6b4f79917080c1c0a9"} Mar 18 13:45:30 crc kubenswrapper[4921]: I0318 13:45:30.960143 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.243527 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26284230-21f3-4580-ac05-967a2b6c63ac" path="/var/lib/kubelet/pods/26284230-21f3-4580-ac05-967a2b6c63ac/volumes" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.254806 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.287891 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.389707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-combined-ca-bundle\") pod \"df87e599-31a3-47be-b740-25701e5e5ca8\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.389930 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqtw7\" (UniqueName: \"kubernetes.io/projected/df87e599-31a3-47be-b740-25701e5e5ca8-kube-api-access-cqtw7\") pod \"df87e599-31a3-47be-b740-25701e5e5ca8\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.389993 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-config-data\") pod \"df87e599-31a3-47be-b740-25701e5e5ca8\" (UID: \"df87e599-31a3-47be-b740-25701e5e5ca8\") " Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.395247 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df87e599-31a3-47be-b740-25701e5e5ca8-kube-api-access-cqtw7" (OuterVolumeSpecName: "kube-api-access-cqtw7") pod "df87e599-31a3-47be-b740-25701e5e5ca8" (UID: "df87e599-31a3-47be-b740-25701e5e5ca8"). InnerVolumeSpecName "kube-api-access-cqtw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.416752 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-config-data" (OuterVolumeSpecName: "config-data") pod "df87e599-31a3-47be-b740-25701e5e5ca8" (UID: "df87e599-31a3-47be-b740-25701e5e5ca8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.420563 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df87e599-31a3-47be-b740-25701e5e5ca8" (UID: "df87e599-31a3-47be-b740-25701e5e5ca8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.491695 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.491737 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df87e599-31a3-47be-b740-25701e5e5ca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.491756 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqtw7\" (UniqueName: \"kubernetes.io/projected/df87e599-31a3-47be-b740-25701e5e5ca8-kube-api-access-cqtw7\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.706947 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xplfh"] Mar 18 13:45:31 crc kubenswrapper[4921]: E0318 13:45:31.707371 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="df87e599-31a3-47be-b740-25701e5e5ca8" containerName="nova-scheduler-scheduler" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.707387 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="df87e599-31a3-47be-b740-25701e5e5ca8" containerName="nova-scheduler-scheduler" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.707577 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="df87e599-31a3-47be-b740-25701e5e5ca8" containerName="nova-scheduler-scheduler" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.708219 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.710340 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.710635 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.715524 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xplfh"] Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.796259 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-config-data\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.796527 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlq2w\" (UniqueName: \"kubernetes.io/projected/645f51e9-2ec5-4a14-80fd-0cdab30d1675-kube-api-access-dlq2w\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: 
\"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.796642 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.796734 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-scripts\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.898008 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-scripts\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.898456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-config-data\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.898531 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlq2w\" (UniqueName: \"kubernetes.io/projected/645f51e9-2ec5-4a14-80fd-0cdab30d1675-kube-api-access-dlq2w\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: 
\"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.898589 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.904387 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-config-data\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.905227 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-scripts\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.905672 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.915812 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlq2w\" (UniqueName: \"kubernetes.io/projected/645f51e9-2ec5-4a14-80fd-0cdab30d1675-kube-api-access-dlq2w\") pod \"nova-cell1-cell-mapping-xplfh\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " 
pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.966697 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"df87e599-31a3-47be-b740-25701e5e5ca8","Type":"ContainerDied","Data":"16b76e5869d1129b1311086a06a16fbd4e71c42898b69868c583d0895c33066e"} Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.967871 4921 scope.go:117] "RemoveContainer" containerID="a786fd4d469c04a8d6fb60df4c5d28075b32545d415f56854c6f984b0a09fe53" Mar 18 13:45:31 crc kubenswrapper[4921]: I0318 13:45:31.966747 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.009824 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.032464 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.041240 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.054711 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.056037 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.069793 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.076497 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.203138 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-config-data\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.203533 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.203618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvt5\" (UniqueName: \"kubernetes.io/projected/75d9974a-5415-487e-8eff-cad4b30b76c6-kube-api-access-blvt5\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.305336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.305403 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blvt5\" (UniqueName: \"kubernetes.io/projected/75d9974a-5415-487e-8eff-cad4b30b76c6-kube-api-access-blvt5\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.305496 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-config-data\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.309989 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-config-data\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.311824 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.323539 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvt5\" (UniqueName: \"kubernetes.io/projected/75d9974a-5415-487e-8eff-cad4b30b76c6-kube-api-access-blvt5\") pod \"nova-scheduler-0\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.394391 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.508630 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xplfh"] Mar 18 13:45:32 crc kubenswrapper[4921]: W0318 13:45:32.512009 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod645f51e9_2ec5_4a14_80fd_0cdab30d1675.slice/crio-2d7ac1f8677958ffc1d46e7d1bbe94371eeb1a37659a78822403da3629f3e67b WatchSource:0}: Error finding container 2d7ac1f8677958ffc1d46e7d1bbe94371eeb1a37659a78822403da3629f3e67b: Status 404 returned error can't find the container with id 2d7ac1f8677958ffc1d46e7d1bbe94371eeb1a37659a78822403da3629f3e67b Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.812209 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:32 crc kubenswrapper[4921]: W0318 13:45:32.814062 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75d9974a_5415_487e_8eff_cad4b30b76c6.slice/crio-422625ae774436db7d589680891432e3b0c3335999a0d4a6999bb960d103491f WatchSource:0}: Error finding container 422625ae774436db7d589680891432e3b0c3335999a0d4a6999bb960d103491f: Status 404 returned error can't find the container with id 422625ae774436db7d589680891432e3b0c3335999a0d4a6999bb960d103491f Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.976528 4921 generic.go:334] "Generic (PLEG): container finished" podID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerID="a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c" exitCode=0 Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.976592 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdm2f" 
event={"ID":"c0ae89fb-75a7-4610-ae64-8304470d74eb","Type":"ContainerDied","Data":"a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c"} Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.981438 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xplfh" event={"ID":"645f51e9-2ec5-4a14-80fd-0cdab30d1675","Type":"ContainerStarted","Data":"17ae14acdc66e55d38a0ff3c961fd7c68afac34991ffb89928aa230c27349a56"} Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.981477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xplfh" event={"ID":"645f51e9-2ec5-4a14-80fd-0cdab30d1675","Type":"ContainerStarted","Data":"2d7ac1f8677958ffc1d46e7d1bbe94371eeb1a37659a78822403da3629f3e67b"} Mar 18 13:45:32 crc kubenswrapper[4921]: I0318 13:45:32.987031 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d9974a-5415-487e-8eff-cad4b30b76c6","Type":"ContainerStarted","Data":"422625ae774436db7d589680891432e3b0c3335999a0d4a6999bb960d103491f"} Mar 18 13:45:33 crc kubenswrapper[4921]: I0318 13:45:33.026224 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xplfh" podStartSLOduration=2.026204276 podStartE2EDuration="2.026204276s" podCreationTimestamp="2026-03-18 13:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:33.016855619 +0000 UTC m=+5752.566776268" watchObservedRunningTime="2026-03-18 13:45:33.026204276 +0000 UTC m=+5752.576124925" Mar 18 13:45:33 crc kubenswrapper[4921]: I0318 13:45:33.219424 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df87e599-31a3-47be-b740-25701e5e5ca8" path="/var/lib/kubelet/pods/df87e599-31a3-47be-b740-25701e5e5ca8/volumes" Mar 18 13:45:34 crc kubenswrapper[4921]: I0318 13:45:34.003894 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-pdm2f" event={"ID":"c0ae89fb-75a7-4610-ae64-8304470d74eb","Type":"ContainerStarted","Data":"aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3"} Mar 18 13:45:34 crc kubenswrapper[4921]: I0318 13:45:34.015423 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d9974a-5415-487e-8eff-cad4b30b76c6","Type":"ContainerStarted","Data":"f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507"} Mar 18 13:45:34 crc kubenswrapper[4921]: I0318 13:45:34.031937 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pdm2f" podStartSLOduration=2.481777475 podStartE2EDuration="5.031920938s" podCreationTimestamp="2026-03-18 13:45:29 +0000 UTC" firstStartedPulling="2026-03-18 13:45:30.959818214 +0000 UTC m=+5750.509738853" lastFinishedPulling="2026-03-18 13:45:33.509961677 +0000 UTC m=+5753.059882316" observedRunningTime="2026-03-18 13:45:34.022389636 +0000 UTC m=+5753.572310275" watchObservedRunningTime="2026-03-18 13:45:34.031920938 +0000 UTC m=+5753.581841597" Mar 18 13:45:34 crc kubenswrapper[4921]: I0318 13:45:34.043855 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.043834618 podStartE2EDuration="2.043834618s" podCreationTimestamp="2026-03-18 13:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:34.042333255 +0000 UTC m=+5753.592253894" watchObservedRunningTime="2026-03-18 13:45:34.043834618 +0000 UTC m=+5753.593755257" Mar 18 13:45:35 crc kubenswrapper[4921]: I0318 13:45:35.209166 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:45:35 crc kubenswrapper[4921]: E0318 13:45:35.209735 4921 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:45:37 crc kubenswrapper[4921]: I0318 13:45:37.395274 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 13:45:38 crc kubenswrapper[4921]: I0318 13:45:38.053819 4921 generic.go:334] "Generic (PLEG): container finished" podID="645f51e9-2ec5-4a14-80fd-0cdab30d1675" containerID="17ae14acdc66e55d38a0ff3c961fd7c68afac34991ffb89928aa230c27349a56" exitCode=0 Mar 18 13:45:38 crc kubenswrapper[4921]: I0318 13:45:38.053914 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xplfh" event={"ID":"645f51e9-2ec5-4a14-80fd-0cdab30d1675","Type":"ContainerDied","Data":"17ae14acdc66e55d38a0ff3c961fd7c68afac34991ffb89928aa230c27349a56"} Mar 18 13:45:38 crc kubenswrapper[4921]: I0318 13:45:38.275844 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:45:38 crc kubenswrapper[4921]: I0318 13:45:38.275915 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:45:38 crc kubenswrapper[4921]: I0318 13:45:38.344075 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:45:38 crc kubenswrapper[4921]: I0318 13:45:38.345582 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.318768 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.96:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.318779 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.96:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.418896 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.428348 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.428435 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.542224 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlq2w\" (UniqueName: \"kubernetes.io/projected/645f51e9-2ec5-4a14-80fd-0cdab30d1675-kube-api-access-dlq2w\") pod \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.542339 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-combined-ca-bundle\") pod \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.542632 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-config-data\") pod \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.542789 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-scripts\") pod \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\" (UID: \"645f51e9-2ec5-4a14-80fd-0cdab30d1675\") " Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.547243 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-scripts" (OuterVolumeSpecName: "scripts") pod "645f51e9-2ec5-4a14-80fd-0cdab30d1675" (UID: "645f51e9-2ec5-4a14-80fd-0cdab30d1675"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.571344 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645f51e9-2ec5-4a14-80fd-0cdab30d1675-kube-api-access-dlq2w" (OuterVolumeSpecName: "kube-api-access-dlq2w") pod "645f51e9-2ec5-4a14-80fd-0cdab30d1675" (UID: "645f51e9-2ec5-4a14-80fd-0cdab30d1675"). InnerVolumeSpecName "kube-api-access-dlq2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.575268 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "645f51e9-2ec5-4a14-80fd-0cdab30d1675" (UID: "645f51e9-2ec5-4a14-80fd-0cdab30d1675"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.582686 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-config-data" (OuterVolumeSpecName: "config-data") pod "645f51e9-2ec5-4a14-80fd-0cdab30d1675" (UID: "645f51e9-2ec5-4a14-80fd-0cdab30d1675"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.644802 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.644837 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.644846 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlq2w\" (UniqueName: \"kubernetes.io/projected/645f51e9-2ec5-4a14-80fd-0cdab30d1675-kube-api-access-dlq2w\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.644855 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645f51e9-2ec5-4a14-80fd-0cdab30d1675-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 
18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.997159 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:39 crc kubenswrapper[4921]: I0318 13:45:39.997212 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.048667 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.079868 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xplfh" Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.079874 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xplfh" event={"ID":"645f51e9-2ec5-4a14-80fd-0cdab30d1675","Type":"ContainerDied","Data":"2d7ac1f8677958ffc1d46e7d1bbe94371eeb1a37659a78822403da3629f3e67b"} Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.079930 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d7ac1f8677958ffc1d46e7d1bbe94371eeb1a37659a78822403da3629f3e67b" Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.147285 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.253778 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.253995 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-log" containerID="cri-o://a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5" gracePeriod=30 Mar 18 13:45:40 crc 
kubenswrapper[4921]: I0318 13:45:40.254224 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-api" containerID="cri-o://0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1" gracePeriod=30 Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.286416 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.286706 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="75d9974a-5415-487e-8eff-cad4b30b76c6" containerName="nova-scheduler-scheduler" containerID="cri-o://f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507" gracePeriod=30 Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.300553 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pdm2f"] Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.328791 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.329026 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-log" containerID="cri-o://4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203" gracePeriod=30 Mar 18 13:45:40 crc kubenswrapper[4921]: I0318 13:45:40.329288 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-metadata" containerID="cri-o://e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528" gracePeriod=30 Mar 18 13:45:41 crc kubenswrapper[4921]: I0318 13:45:41.089688 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerID="a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5" exitCode=143 Mar 18 13:45:41 crc kubenswrapper[4921]: I0318 13:45:41.089774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4efa1435-f42b-493d-8cae-4b2c25bec8f0","Type":"ContainerDied","Data":"a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5"} Mar 18 13:45:41 crc kubenswrapper[4921]: I0318 13:45:41.092109 4921 generic.go:334] "Generic (PLEG): container finished" podID="95766957-d3ab-453a-a198-6a199b9ddd71" containerID="4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203" exitCode=143 Mar 18 13:45:41 crc kubenswrapper[4921]: I0318 13:45:41.092237 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95766957-d3ab-453a-a198-6a199b9ddd71","Type":"ContainerDied","Data":"4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203"} Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.101388 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pdm2f" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="registry-server" containerID="cri-o://aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3" gracePeriod=2 Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.649451 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.711756 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpz6x\" (UniqueName: \"kubernetes.io/projected/c0ae89fb-75a7-4610-ae64-8304470d74eb-kube-api-access-tpz6x\") pod \"c0ae89fb-75a7-4610-ae64-8304470d74eb\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.711950 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-utilities\") pod \"c0ae89fb-75a7-4610-ae64-8304470d74eb\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.712084 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-catalog-content\") pod \"c0ae89fb-75a7-4610-ae64-8304470d74eb\" (UID: \"c0ae89fb-75a7-4610-ae64-8304470d74eb\") " Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.719732 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-utilities" (OuterVolumeSpecName: "utilities") pod "c0ae89fb-75a7-4610-ae64-8304470d74eb" (UID: "c0ae89fb-75a7-4610-ae64-8304470d74eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.730093 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ae89fb-75a7-4610-ae64-8304470d74eb-kube-api-access-tpz6x" (OuterVolumeSpecName: "kube-api-access-tpz6x") pod "c0ae89fb-75a7-4610-ae64-8304470d74eb" (UID: "c0ae89fb-75a7-4610-ae64-8304470d74eb"). InnerVolumeSpecName "kube-api-access-tpz6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.781457 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0ae89fb-75a7-4610-ae64-8304470d74eb" (UID: "c0ae89fb-75a7-4610-ae64-8304470d74eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.814724 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.814801 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpz6x\" (UniqueName: \"kubernetes.io/projected/c0ae89fb-75a7-4610-ae64-8304470d74eb-kube-api-access-tpz6x\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:42 crc kubenswrapper[4921]: I0318 13:45:42.814819 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0ae89fb-75a7-4610-ae64-8304470d74eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.110647 4921 generic.go:334] "Generic (PLEG): container finished" podID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerID="aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3" exitCode=0 Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.110741 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pdm2f" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.112006 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdm2f" event={"ID":"c0ae89fb-75a7-4610-ae64-8304470d74eb","Type":"ContainerDied","Data":"aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3"} Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.112190 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pdm2f" event={"ID":"c0ae89fb-75a7-4610-ae64-8304470d74eb","Type":"ContainerDied","Data":"1f02f6b32afa2ed1bb624ced22d35fa5fb8985e578fe0c6b4f79917080c1c0a9"} Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.112301 4921 scope.go:117] "RemoveContainer" containerID="aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.132486 4921 scope.go:117] "RemoveContainer" containerID="a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.156038 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pdm2f"] Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.161849 4921 scope.go:117] "RemoveContainer" containerID="29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.167658 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pdm2f"] Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.202201 4921 scope.go:117] "RemoveContainer" containerID="aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3" Mar 18 13:45:43 crc kubenswrapper[4921]: E0318 13:45:43.203390 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3\": container with ID starting with aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3 not found: ID does not exist" containerID="aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.203424 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3"} err="failed to get container status \"aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3\": rpc error: code = NotFound desc = could not find container \"aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3\": container with ID starting with aa8ff6c6d8039a7cc16c821d4acb3e83d87a86b62fca08a5a3649d0b838e1aa3 not found: ID does not exist" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.203445 4921 scope.go:117] "RemoveContainer" containerID="a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c" Mar 18 13:45:43 crc kubenswrapper[4921]: E0318 13:45:43.203713 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c\": container with ID starting with a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c not found: ID does not exist" containerID="a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.203760 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c"} err="failed to get container status \"a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c\": rpc error: code = NotFound desc = could not find container \"a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c\": container with ID 
starting with a4b1749f852c4122a74ad4c0c8e2b82faa9be9973d646f3ec11ec5763dd9993c not found: ID does not exist" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.203784 4921 scope.go:117] "RemoveContainer" containerID="29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731" Mar 18 13:45:43 crc kubenswrapper[4921]: E0318 13:45:43.204093 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731\": container with ID starting with 29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731 not found: ID does not exist" containerID="29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.204133 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731"} err="failed to get container status \"29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731\": rpc error: code = NotFound desc = could not find container \"29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731\": container with ID starting with 29e585e69169cec833c34717be4f3fa9c920444d8fbab79b4d7865382f29d731 not found: ID does not exist" Mar 18 13:45:43 crc kubenswrapper[4921]: I0318 13:45:43.217890 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" path="/var/lib/kubelet/pods/c0ae89fb-75a7-4610-ae64-8304470d74eb/volumes" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.084134 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.124847 4921 generic.go:334] "Generic (PLEG): container finished" podID="95766957-d3ab-453a-a198-6a199b9ddd71" containerID="e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528" exitCode=0 Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.124917 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.124948 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95766957-d3ab-453a-a198-6a199b9ddd71","Type":"ContainerDied","Data":"e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528"} Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.125405 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95766957-d3ab-453a-a198-6a199b9ddd71","Type":"ContainerDied","Data":"90d981df24948c643ac3b86cc697fff998c73bb6053c14124526d15383b10bac"} Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.125425 4921 scope.go:117] "RemoveContainer" containerID="e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.156615 4921 scope.go:117] "RemoveContainer" containerID="4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.176409 4921 scope.go:117] "RemoveContainer" containerID="e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528" Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.176843 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528\": container with ID starting with e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528 not found: ID does 
not exist" containerID="e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.176989 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528"} err="failed to get container status \"e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528\": rpc error: code = NotFound desc = could not find container \"e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528\": container with ID starting with e379f1e0fb0d7efece9e31a1acb2156fa1434a122c191422d9e8f53d0a9c2528 not found: ID does not exist" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.177017 4921 scope.go:117] "RemoveContainer" containerID="4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203" Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.177353 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203\": container with ID starting with 4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203 not found: ID does not exist" containerID="4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.177384 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203"} err="failed to get container status \"4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203\": rpc error: code = NotFound desc = could not find container \"4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203\": container with ID starting with 4aefc1b3f8fabf84149c079724995dc944cc136667b670e488b6660997c45203 not found: ID does not exist" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.252225 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95766957-d3ab-453a-a198-6a199b9ddd71-logs\") pod \"95766957-d3ab-453a-a198-6a199b9ddd71\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.252476 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-config-data\") pod \"95766957-d3ab-453a-a198-6a199b9ddd71\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.252513 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-combined-ca-bundle\") pod \"95766957-d3ab-453a-a198-6a199b9ddd71\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.252570 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fj7n\" (UniqueName: \"kubernetes.io/projected/95766957-d3ab-453a-a198-6a199b9ddd71-kube-api-access-5fj7n\") pod \"95766957-d3ab-453a-a198-6a199b9ddd71\" (UID: \"95766957-d3ab-453a-a198-6a199b9ddd71\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.253442 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95766957-d3ab-453a-a198-6a199b9ddd71-logs" (OuterVolumeSpecName: "logs") pod "95766957-d3ab-453a-a198-6a199b9ddd71" (UID: "95766957-d3ab-453a-a198-6a199b9ddd71"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.259541 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95766957-d3ab-453a-a198-6a199b9ddd71-kube-api-access-5fj7n" (OuterVolumeSpecName: "kube-api-access-5fj7n") pod "95766957-d3ab-453a-a198-6a199b9ddd71" (UID: "95766957-d3ab-453a-a198-6a199b9ddd71"). InnerVolumeSpecName "kube-api-access-5fj7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.278272 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-config-data" (OuterVolumeSpecName: "config-data") pod "95766957-d3ab-453a-a198-6a199b9ddd71" (UID: "95766957-d3ab-453a-a198-6a199b9ddd71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.287836 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95766957-d3ab-453a-a198-6a199b9ddd71" (UID: "95766957-d3ab-453a-a198-6a199b9ddd71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.354762 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95766957-d3ab-453a-a198-6a199b9ddd71-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.354809 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.354821 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95766957-d3ab-453a-a198-6a199b9ddd71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.354834 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fj7n\" (UniqueName: \"kubernetes.io/projected/95766957-d3ab-453a-a198-6a199b9ddd71-kube-api-access-5fj7n\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.480967 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.499159 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.516569 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.525074 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-log" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525136 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-log" Mar 18 13:45:44 crc 
kubenswrapper[4921]: E0318 13:45:44.525154 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="extract-utilities" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525162 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="extract-utilities" Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.525182 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="extract-content" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525190 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="extract-content" Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.525199 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-metadata" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525206 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-metadata" Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.525234 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="registry-server" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525244 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="registry-server" Mar 18 13:45:44 crc kubenswrapper[4921]: E0318 13:45:44.525266 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645f51e9-2ec5-4a14-80fd-0cdab30d1675" containerName="nova-manage" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525273 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="645f51e9-2ec5-4a14-80fd-0cdab30d1675" containerName="nova-manage" Mar 18 13:45:44 crc 
kubenswrapper[4921]: I0318 13:45:44.525470 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-metadata" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525483 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="645f51e9-2ec5-4a14-80fd-0cdab30d1675" containerName="nova-manage" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525497 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" containerName="nova-metadata-log" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.525510 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ae89fb-75a7-4610-ae64-8304470d74eb" containerName="registry-server" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.526676 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.530562 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.533477 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.660679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.661002 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-config-data\") pod \"nova-metadata-0\" (UID: 
\"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.661061 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916a791-4604-4652-9dd1-354d91186046-logs\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.661277 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4588\" (UniqueName: \"kubernetes.io/projected/9916a791-4604-4652-9dd1-354d91186046-kube-api-access-t4588\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.764529 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-config-data\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.764614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916a791-4604-4652-9dd1-354d91186046-logs\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.765195 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916a791-4604-4652-9dd1-354d91186046-logs\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.765378 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-t4588\" (UniqueName: \"kubernetes.io/projected/9916a791-4604-4652-9dd1-354d91186046-kube-api-access-t4588\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.766081 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.769261 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-config-data\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.769304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.781769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4588\" (UniqueName: \"kubernetes.io/projected/9916a791-4604-4652-9dd1-354d91186046-kube-api-access-t4588\") pod \"nova-metadata-0\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.827952 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.879184 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.980288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-config-data\") pod \"75d9974a-5415-487e-8eff-cad4b30b76c6\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.980390 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-combined-ca-bundle\") pod \"75d9974a-5415-487e-8eff-cad4b30b76c6\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.980441 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blvt5\" (UniqueName: \"kubernetes.io/projected/75d9974a-5415-487e-8eff-cad4b30b76c6-kube-api-access-blvt5\") pod \"75d9974a-5415-487e-8eff-cad4b30b76c6\" (UID: \"75d9974a-5415-487e-8eff-cad4b30b76c6\") " Mar 18 13:45:44 crc kubenswrapper[4921]: I0318 13:45:44.988801 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d9974a-5415-487e-8eff-cad4b30b76c6-kube-api-access-blvt5" (OuterVolumeSpecName: "kube-api-access-blvt5") pod "75d9974a-5415-487e-8eff-cad4b30b76c6" (UID: "75d9974a-5415-487e-8eff-cad4b30b76c6"). InnerVolumeSpecName "kube-api-access-blvt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.021894 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75d9974a-5415-487e-8eff-cad4b30b76c6" (UID: "75d9974a-5415-487e-8eff-cad4b30b76c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.035836 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-config-data" (OuterVolumeSpecName: "config-data") pod "75d9974a-5415-487e-8eff-cad4b30b76c6" (UID: "75d9974a-5415-487e-8eff-cad4b30b76c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.084474 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.084768 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d9974a-5415-487e-8eff-cad4b30b76c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.084782 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blvt5\" (UniqueName: \"kubernetes.io/projected/75d9974a-5415-487e-8eff-cad4b30b76c6-kube-api-access-blvt5\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.101258 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.139496 4921 generic.go:334] "Generic (PLEG): container finished" podID="75d9974a-5415-487e-8eff-cad4b30b76c6" containerID="f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507" exitCode=0 Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.139569 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d9974a-5415-487e-8eff-cad4b30b76c6","Type":"ContainerDied","Data":"f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507"} Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.139598 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"75d9974a-5415-487e-8eff-cad4b30b76c6","Type":"ContainerDied","Data":"422625ae774436db7d589680891432e3b0c3335999a0d4a6999bb960d103491f"} Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.139621 4921 scope.go:117] "RemoveContainer" containerID="f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.139735 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.145994 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4efa1435-f42b-493d-8cae-4b2c25bec8f0","Type":"ContainerDied","Data":"0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1"} Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.146015 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.145903 4921 generic.go:334] "Generic (PLEG): container finished" podID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerID="0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1" exitCode=0 Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.148342 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4efa1435-f42b-493d-8cae-4b2c25bec8f0","Type":"ContainerDied","Data":"efd1b920882ef073935e3588335cb943cc03d74f076c3f975336e5a50fc5fd77"} Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.175522 4921 scope.go:117] "RemoveContainer" containerID="f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507" Mar 18 13:45:45 crc kubenswrapper[4921]: E0318 13:45:45.176255 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507\": container with ID starting with f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507 not found: ID does not exist" containerID="f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.176348 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507"} err="failed to get container status \"f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507\": rpc error: code = NotFound desc = could not find container \"f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507\": container with ID starting with f777456d2fd4dba5904a3531e18a9392cd669ae058d2ae2f7d1009e484b9e507 not found: ID does not exist" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.176415 4921 scope.go:117] "RemoveContainer" 
containerID="0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.193087 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xskb5\" (UniqueName: \"kubernetes.io/projected/4efa1435-f42b-493d-8cae-4b2c25bec8f0-kube-api-access-xskb5\") pod \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.193200 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-config-data\") pod \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.193399 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-combined-ca-bundle\") pod \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.193428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efa1435-f42b-493d-8cae-4b2c25bec8f0-logs\") pod \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\" (UID: \"4efa1435-f42b-493d-8cae-4b2c25bec8f0\") " Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.196136 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efa1435-f42b-493d-8cae-4b2c25bec8f0-logs" (OuterVolumeSpecName: "logs") pod "4efa1435-f42b-493d-8cae-4b2c25bec8f0" (UID: "4efa1435-f42b-493d-8cae-4b2c25bec8f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.200621 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.202103 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efa1435-f42b-493d-8cae-4b2c25bec8f0-kube-api-access-xskb5" (OuterVolumeSpecName: "kube-api-access-xskb5") pod "4efa1435-f42b-493d-8cae-4b2c25bec8f0" (UID: "4efa1435-f42b-493d-8cae-4b2c25bec8f0"). InnerVolumeSpecName "kube-api-access-xskb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.221360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-config-data" (OuterVolumeSpecName: "config-data") pod "4efa1435-f42b-493d-8cae-4b2c25bec8f0" (UID: "4efa1435-f42b-493d-8cae-4b2c25bec8f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.222406 4921 scope.go:117] "RemoveContainer" containerID="a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.235893 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95766957-d3ab-453a-a198-6a199b9ddd71" path="/var/lib/kubelet/pods/95766957-d3ab-453a-a198-6a199b9ddd71/volumes" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241127 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241172 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: E0318 13:45:45.241467 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-log" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241485 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-log" Mar 18 13:45:45 crc kubenswrapper[4921]: E0318 13:45:45.241498 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-api" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241505 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-api" Mar 18 13:45:45 crc kubenswrapper[4921]: E0318 13:45:45.241532 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9974a-5415-487e-8eff-cad4b30b76c6" containerName="nova-scheduler-scheduler" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241538 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9974a-5415-487e-8eff-cad4b30b76c6" containerName="nova-scheduler-scheduler" Mar 18 13:45:45 crc 
kubenswrapper[4921]: I0318 13:45:45.241702 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d9974a-5415-487e-8eff-cad4b30b76c6" containerName="nova-scheduler-scheduler" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241715 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-log" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.241731 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" containerName="nova-api-api" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.242418 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.242562 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.244839 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.247033 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4efa1435-f42b-493d-8cae-4b2c25bec8f0" (UID: "4efa1435-f42b-493d-8cae-4b2c25bec8f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.259332 4921 scope.go:117] "RemoveContainer" containerID="0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1" Mar 18 13:45:45 crc kubenswrapper[4921]: E0318 13:45:45.260177 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1\": container with ID starting with 0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1 not found: ID does not exist" containerID="0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.260211 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1"} err="failed to get container status \"0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1\": rpc error: code = NotFound desc = could not find container \"0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1\": container with ID starting with 0f07906fc1570fbf424656f3d6ee907665e7a69e9d264008adbbd12aabc1dad1 not found: ID does not exist" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.260237 4921 scope.go:117] "RemoveContainer" containerID="a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5" Mar 18 13:45:45 crc kubenswrapper[4921]: E0318 13:45:45.260564 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5\": container with ID starting with a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5 not found: ID does not exist" containerID="a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.260595 
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5"} err="failed to get container status \"a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5\": rpc error: code = NotFound desc = could not find container \"a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5\": container with ID starting with a6d14d28250d1065d95dec34b4112e8a976ec6c6d6e5cb8c1ad8b8fa27a9bda5 not found: ID does not exist" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.296271 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.296419 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efa1435-f42b-493d-8cae-4b2c25bec8f0-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.296501 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xskb5\" (UniqueName: \"kubernetes.io/projected/4efa1435-f42b-493d-8cae-4b2c25bec8f0-kube-api-access-xskb5\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.296593 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efa1435-f42b-493d-8cae-4b2c25bec8f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.351405 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.399026 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.399250 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4w7\" (UniqueName: \"kubernetes.io/projected/89230afb-39c8-4ada-a134-d329c12c54d9-kube-api-access-4w4w7\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.399466 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-config-data\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.484514 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.498395 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.501194 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.501277 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4w7\" (UniqueName: \"kubernetes.io/projected/89230afb-39c8-4ada-a134-d329c12c54d9-kube-api-access-4w4w7\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " 
pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.501372 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-config-data\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.505416 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.505438 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-config-data\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.513275 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.515156 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.517753 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.522821 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4w7\" (UniqueName: \"kubernetes.io/projected/89230afb-39c8-4ada-a134-d329c12c54d9-kube-api-access-4w4w7\") pod \"nova-scheduler-0\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.523152 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.619081 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.704821 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-config-data\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.704925 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.705011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c998b6ad-897c-4d19-9216-b6950058a3e2-logs\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" 
Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.705103 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pm8q\" (UniqueName: \"kubernetes.io/projected/c998b6ad-897c-4d19-9216-b6950058a3e2-kube-api-access-8pm8q\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.806653 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pm8q\" (UniqueName: \"kubernetes.io/projected/c998b6ad-897c-4d19-9216-b6950058a3e2-kube-api-access-8pm8q\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.806930 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-config-data\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.806995 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.807044 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c998b6ad-897c-4d19-9216-b6950058a3e2-logs\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.807454 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c998b6ad-897c-4d19-9216-b6950058a3e2-logs\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.814786 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.816029 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-config-data\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.823760 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pm8q\" (UniqueName: \"kubernetes.io/projected/c998b6ad-897c-4d19-9216-b6950058a3e2-kube-api-access-8pm8q\") pod \"nova-api-0\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " pod="openstack/nova-api-0" Mar 18 13:45:45 crc kubenswrapper[4921]: I0318 13:45:45.893499 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:45:46 crc kubenswrapper[4921]: I0318 13:45:46.059567 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:45:46 crc kubenswrapper[4921]: I0318 13:45:46.168930 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9916a791-4604-4652-9dd1-354d91186046","Type":"ContainerStarted","Data":"b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828"} Mar 18 13:45:46 crc kubenswrapper[4921]: I0318 13:45:46.169019 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9916a791-4604-4652-9dd1-354d91186046","Type":"ContainerStarted","Data":"965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39"} Mar 18 13:45:46 crc kubenswrapper[4921]: I0318 13:45:46.169032 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9916a791-4604-4652-9dd1-354d91186046","Type":"ContainerStarted","Data":"70257accb71cb34e4fbe6a43b734515ac3ba0b4c320c4e2a4ae3e06768e4fe98"} Mar 18 13:45:46 crc kubenswrapper[4921]: I0318 13:45:46.174539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"89230afb-39c8-4ada-a134-d329c12c54d9","Type":"ContainerStarted","Data":"82de5935d1836c81370803c9d4ad89c3cee7754031402a34539e8441bbd39058"} Mar 18 13:45:46 crc kubenswrapper[4921]: I0318 13:45:46.190818 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.190794934 podStartE2EDuration="2.190794934s" podCreationTimestamp="2026-03-18 13:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:46.187652064 +0000 UTC m=+5765.737572703" watchObservedRunningTime="2026-03-18 13:45:46.190794934 +0000 UTC m=+5765.740715573" Mar 18 13:45:46 crc 
kubenswrapper[4921]: I0318 13:45:46.367446 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.188795 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c998b6ad-897c-4d19-9216-b6950058a3e2","Type":"ContainerStarted","Data":"d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a"} Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.189392 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c998b6ad-897c-4d19-9216-b6950058a3e2","Type":"ContainerStarted","Data":"f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606"} Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.189418 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c998b6ad-897c-4d19-9216-b6950058a3e2","Type":"ContainerStarted","Data":"302945e096aa37fe658c4346c88e57a28400a8a391c90bfce2fb42af9fca7f54"} Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.192673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"89230afb-39c8-4ada-a134-d329c12c54d9","Type":"ContainerStarted","Data":"62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414"} Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.211984 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.211963637 podStartE2EDuration="2.211963637s" podCreationTimestamp="2026-03-18 13:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:47.209935319 +0000 UTC m=+5766.759855958" watchObservedRunningTime="2026-03-18 13:45:47.211963637 +0000 UTC m=+5766.761884276" Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.221649 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="4efa1435-f42b-493d-8cae-4b2c25bec8f0" path="/var/lib/kubelet/pods/4efa1435-f42b-493d-8cae-4b2c25bec8f0/volumes" Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.222300 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d9974a-5415-487e-8eff-cad4b30b76c6" path="/var/lib/kubelet/pods/75d9974a-5415-487e-8eff-cad4b30b76c6/volumes" Mar 18 13:45:47 crc kubenswrapper[4921]: I0318 13:45:47.229690 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.229665922 podStartE2EDuration="2.229665922s" podCreationTimestamp="2026-03-18 13:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:47.227921182 +0000 UTC m=+5766.777841821" watchObservedRunningTime="2026-03-18 13:45:47.229665922 +0000 UTC m=+5766.779586561" Mar 18 13:45:50 crc kubenswrapper[4921]: I0318 13:45:50.209057 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:45:50 crc kubenswrapper[4921]: E0318 13:45:50.209707 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:45:50 crc kubenswrapper[4921]: I0318 13:45:50.619650 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 13:45:54 crc kubenswrapper[4921]: I0318 13:45:54.880406 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:45:54 crc kubenswrapper[4921]: I0318 
13:45:54.881009 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:45:55 crc kubenswrapper[4921]: I0318 13:45:55.621234 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 13:45:55 crc kubenswrapper[4921]: I0318 13:45:55.646941 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 13:45:55 crc kubenswrapper[4921]: I0318 13:45:55.893800 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:45:55 crc kubenswrapper[4921]: I0318 13:45:55.893866 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:45:55 crc kubenswrapper[4921]: I0318 13:45:55.962430 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:55 crc kubenswrapper[4921]: I0318 13:45:55.962741 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:56 crc kubenswrapper[4921]: I0318 13:45:56.321052 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 13:45:56 crc kubenswrapper[4921]: I0318 13:45:56.976347 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.103:8774/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:45:56 crc kubenswrapper[4921]: I0318 13:45:56.976360 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.103:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.154663 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564026-6mfj5"] Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.156388 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.159342 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.159459 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.159578 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.166083 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-6mfj5"] Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.293757 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjz4q\" (UniqueName: \"kubernetes.io/projected/30556c0c-ed1c-499b-92a9-bbda4534c1c7-kube-api-access-vjz4q\") pod \"auto-csr-approver-29564026-6mfj5\" (UID: \"30556c0c-ed1c-499b-92a9-bbda4534c1c7\") " pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 
13:46:00.395336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjz4q\" (UniqueName: \"kubernetes.io/projected/30556c0c-ed1c-499b-92a9-bbda4534c1c7-kube-api-access-vjz4q\") pod \"auto-csr-approver-29564026-6mfj5\" (UID: \"30556c0c-ed1c-499b-92a9-bbda4534c1c7\") " pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.415344 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjz4q\" (UniqueName: \"kubernetes.io/projected/30556c0c-ed1c-499b-92a9-bbda4534c1c7-kube-api-access-vjz4q\") pod \"auto-csr-approver-29564026-6mfj5\" (UID: \"30556c0c-ed1c-499b-92a9-bbda4534c1c7\") " pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.481654 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:00 crc kubenswrapper[4921]: I0318 13:46:00.906839 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-6mfj5"] Mar 18 13:46:00 crc kubenswrapper[4921]: W0318 13:46:00.910299 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30556c0c_ed1c_499b_92a9_bbda4534c1c7.slice/crio-873260d83598eaa42db3535a57931ca91274d604cb8c63e86d8e14472b46e2f3 WatchSource:0}: Error finding container 873260d83598eaa42db3535a57931ca91274d604cb8c63e86d8e14472b46e2f3: Status 404 returned error can't find the container with id 873260d83598eaa42db3535a57931ca91274d604cb8c63e86d8e14472b46e2f3 Mar 18 13:46:01 crc kubenswrapper[4921]: I0318 13:46:01.341005 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" 
event={"ID":"30556c0c-ed1c-499b-92a9-bbda4534c1c7","Type":"ContainerStarted","Data":"873260d83598eaa42db3535a57931ca91274d604cb8c63e86d8e14472b46e2f3"} Mar 18 13:46:02 crc kubenswrapper[4921]: I0318 13:46:02.351681 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" event={"ID":"30556c0c-ed1c-499b-92a9-bbda4534c1c7","Type":"ContainerStarted","Data":"42e230358254e0c58e862fcc3a40dcbe8a396e3eb1400160477b2f8a2a4ebfe4"} Mar 18 13:46:02 crc kubenswrapper[4921]: I0318 13:46:02.370625 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" podStartSLOduration=1.338039705 podStartE2EDuration="2.370602333s" podCreationTimestamp="2026-03-18 13:46:00 +0000 UTC" firstStartedPulling="2026-03-18 13:46:00.91332439 +0000 UTC m=+5780.463245029" lastFinishedPulling="2026-03-18 13:46:01.945887018 +0000 UTC m=+5781.495807657" observedRunningTime="2026-03-18 13:46:02.366350282 +0000 UTC m=+5781.916270921" watchObservedRunningTime="2026-03-18 13:46:02.370602333 +0000 UTC m=+5781.920522982" Mar 18 13:46:02 crc kubenswrapper[4921]: I0318 13:46:02.879962 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:46:02 crc kubenswrapper[4921]: I0318 13:46:02.880031 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.209878 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:46:03 crc kubenswrapper[4921]: E0318 13:46:03.210143 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.361937 4921 generic.go:334] "Generic (PLEG): container finished" podID="30556c0c-ed1c-499b-92a9-bbda4534c1c7" containerID="42e230358254e0c58e862fcc3a40dcbe8a396e3eb1400160477b2f8a2a4ebfe4" exitCode=0 Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.361979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" event={"ID":"30556c0c-ed1c-499b-92a9-bbda4534c1c7","Type":"ContainerDied","Data":"42e230358254e0c58e862fcc3a40dcbe8a396e3eb1400160477b2f8a2a4ebfe4"} Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.653993 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8tdrt"] Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.657613 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.662865 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8tdrt"] Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.757083 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-utilities\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.757193 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxntr\" (UniqueName: \"kubernetes.io/projected/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-kube-api-access-kxntr\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " 
pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.757386 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-catalog-content\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.858998 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxntr\" (UniqueName: \"kubernetes.io/projected/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-kube-api-access-kxntr\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.859176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-catalog-content\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.859299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-utilities\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.859922 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-utilities\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " 
pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.860081 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-catalog-content\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.884481 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxntr\" (UniqueName: \"kubernetes.io/projected/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-kube-api-access-kxntr\") pod \"redhat-operators-8tdrt\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.894216 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.894374 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:46:03 crc kubenswrapper[4921]: I0318 13:46:03.980677 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.490700 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8tdrt"] Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.650466 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.776388 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjz4q\" (UniqueName: \"kubernetes.io/projected/30556c0c-ed1c-499b-92a9-bbda4534c1c7-kube-api-access-vjz4q\") pod \"30556c0c-ed1c-499b-92a9-bbda4534c1c7\" (UID: \"30556c0c-ed1c-499b-92a9-bbda4534c1c7\") " Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.782934 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30556c0c-ed1c-499b-92a9-bbda4534c1c7-kube-api-access-vjz4q" (OuterVolumeSpecName: "kube-api-access-vjz4q") pod "30556c0c-ed1c-499b-92a9-bbda4534c1c7" (UID: "30556c0c-ed1c-499b-92a9-bbda4534c1c7"). InnerVolumeSpecName "kube-api-access-vjz4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.878379 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjz4q\" (UniqueName: \"kubernetes.io/projected/30556c0c-ed1c-499b-92a9-bbda4534c1c7-kube-api-access-vjz4q\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.881729 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.884358 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:46:04 crc kubenswrapper[4921]: I0318 13:46:04.897592 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.378065 4921 generic.go:334] "Generic (PLEG): container finished" podID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerID="1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42" exitCode=0 Mar 18 13:46:05 crc kubenswrapper[4921]: 
I0318 13:46:05.378104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerDied","Data":"1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42"} Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.378157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerStarted","Data":"2a14c305d1179018d843b1b5f5375e707026708be18b4ad9b165db3eb38a582e"} Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.389832 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.391459 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-6mfj5" event={"ID":"30556c0c-ed1c-499b-92a9-bbda4534c1c7","Type":"ContainerDied","Data":"873260d83598eaa42db3535a57931ca91274d604cb8c63e86d8e14472b46e2f3"} Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.391498 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="873260d83598eaa42db3535a57931ca91274d604cb8c63e86d8e14472b46e2f3" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.405221 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.453728 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-2q92n"] Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.464224 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-2q92n"] Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.897282 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.897364 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.902343 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:46:05 crc kubenswrapper[4921]: I0318 13:46:05.902399 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.084467 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-4h28g"] Mar 18 13:46:06 crc kubenswrapper[4921]: E0318 13:46:06.084968 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30556c0c-ed1c-499b-92a9-bbda4534c1c7" containerName="oc" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.084992 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="30556c0c-ed1c-499b-92a9-bbda4534c1c7" containerName="oc" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.085267 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="30556c0c-ed1c-499b-92a9-bbda4534c1c7" containerName="oc" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.090863 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.097940 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-4h28g"] Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.206499 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-config\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.206571 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-nb\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.206642 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-sb\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.206691 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-dns-svc\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.206713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qb2v9\" (UniqueName: \"kubernetes.io/projected/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-kube-api-access-qb2v9\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.308304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-sb\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.308395 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-dns-svc\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.308423 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2v9\" (UniqueName: \"kubernetes.io/projected/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-kube-api-access-qb2v9\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.308583 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-config\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.308619 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-nb\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.309560 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-nb\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.310139 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-config\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.310790 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-sb\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.311033 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-dns-svc\") pod \"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.329599 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2v9\" (UniqueName: \"kubernetes.io/projected/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-kube-api-access-qb2v9\") pod 
\"dnsmasq-dns-85c7886d8f-4h28g\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.398979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerStarted","Data":"f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096"} Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.422662 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:06 crc kubenswrapper[4921]: I0318 13:46:06.879340 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-4h28g"] Mar 18 13:46:07 crc kubenswrapper[4921]: I0318 13:46:07.221289 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd58bd96-9f87-4909-b9aa-2cb0fc619b6a" path="/var/lib/kubelet/pods/dd58bd96-9f87-4909-b9aa-2cb0fc619b6a/volumes" Mar 18 13:46:07 crc kubenswrapper[4921]: I0318 13:46:07.409881 4921 generic.go:334] "Generic (PLEG): container finished" podID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerID="f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096" exitCode=0 Mar 18 13:46:07 crc kubenswrapper[4921]: I0318 13:46:07.409941 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerDied","Data":"f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096"} Mar 18 13:46:07 crc kubenswrapper[4921]: I0318 13:46:07.412631 4921 generic.go:334] "Generic (PLEG): container finished" podID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerID="229264ac90f3e80c1f2a0f0efecb0e68130a530ab30cd8cf780431a2a905e396" exitCode=0 Mar 18 13:46:07 crc kubenswrapper[4921]: I0318 13:46:07.413534 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" event={"ID":"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4","Type":"ContainerDied","Data":"229264ac90f3e80c1f2a0f0efecb0e68130a530ab30cd8cf780431a2a905e396"} Mar 18 13:46:07 crc kubenswrapper[4921]: I0318 13:46:07.413563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" event={"ID":"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4","Type":"ContainerStarted","Data":"60aadb52963d359b536c78f1e6577f29c59b70c16c5083407920ce3dba96b47a"} Mar 18 13:46:08 crc kubenswrapper[4921]: I0318 13:46:08.422996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerStarted","Data":"0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b"} Mar 18 13:46:08 crc kubenswrapper[4921]: I0318 13:46:08.427345 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" event={"ID":"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4","Type":"ContainerStarted","Data":"da83f9de1dad7d2ceba32cf9d7667f1e82333087deec36e327b0dcb86ddda34e"} Mar 18 13:46:08 crc kubenswrapper[4921]: I0318 13:46:08.427508 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:08 crc kubenswrapper[4921]: I0318 13:46:08.446513 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8tdrt" podStartSLOduration=2.884765848 podStartE2EDuration="5.446491371s" podCreationTimestamp="2026-03-18 13:46:03 +0000 UTC" firstStartedPulling="2026-03-18 13:46:05.379702589 +0000 UTC m=+5784.929623228" lastFinishedPulling="2026-03-18 13:46:07.941428102 +0000 UTC m=+5787.491348751" observedRunningTime="2026-03-18 13:46:08.440455909 +0000 UTC m=+5787.990376558" watchObservedRunningTime="2026-03-18 13:46:08.446491371 +0000 UTC m=+5787.996412010" Mar 18 13:46:08 crc kubenswrapper[4921]: 
I0318 13:46:08.468648 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" podStartSLOduration=2.468627793 podStartE2EDuration="2.468627793s" podCreationTimestamp="2026-03-18 13:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:08.457716992 +0000 UTC m=+5788.007638031" watchObservedRunningTime="2026-03-18 13:46:08.468627793 +0000 UTC m=+5788.018548432" Mar 18 13:46:08 crc kubenswrapper[4921]: I0318 13:46:08.742667 4921 scope.go:117] "RemoveContainer" containerID="b74bdd4f2d85a3fc5bd6360df807a43f16e16fd4720dc30fb44e82a94cd95aa3" Mar 18 13:46:08 crc kubenswrapper[4921]: I0318 13:46:08.806370 4921 scope.go:117] "RemoveContainer" containerID="9d505edab099638347c83892a468b29ddbeed91243f052b66bac3c2ef60263fa" Mar 18 13:46:13 crc kubenswrapper[4921]: I0318 13:46:13.981001 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:13 crc kubenswrapper[4921]: I0318 13:46:13.981495 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:15 crc kubenswrapper[4921]: I0318 13:46:15.029477 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8tdrt" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="registry-server" probeResult="failure" output=< Mar 18 13:46:15 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 13:46:15 crc kubenswrapper[4921]: > Mar 18 13:46:15 crc kubenswrapper[4921]: I0318 13:46:15.210206 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:46:15 crc kubenswrapper[4921]: E0318 13:46:15.210595 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:46:16 crc kubenswrapper[4921]: I0318 13:46:16.426723 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:16 crc kubenswrapper[4921]: I0318 13:46:16.503604 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-nvscx"] Mar 18 13:46:16 crc kubenswrapper[4921]: I0318 13:46:16.504608 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerName="dnsmasq-dns" containerID="cri-o://c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58" gracePeriod=10 Mar 18 13:46:16 crc kubenswrapper[4921]: I0318 13:46:16.982172 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.147049 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-dns-svc\") pod \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.147180 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-nb\") pod \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.147258 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-config\") pod \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.147288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-sb\") pod \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.147434 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf9nl\" (UniqueName: \"kubernetes.io/projected/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-kube-api-access-zf9nl\") pod \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\" (UID: \"4db0a9e2-b13b-44be-bfb3-68c6590e81b4\") " Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.153232 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-kube-api-access-zf9nl" (OuterVolumeSpecName: "kube-api-access-zf9nl") pod "4db0a9e2-b13b-44be-bfb3-68c6590e81b4" (UID: "4db0a9e2-b13b-44be-bfb3-68c6590e81b4"). InnerVolumeSpecName "kube-api-access-zf9nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.199355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4db0a9e2-b13b-44be-bfb3-68c6590e81b4" (UID: "4db0a9e2-b13b-44be-bfb3-68c6590e81b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.200038 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-config" (OuterVolumeSpecName: "config") pod "4db0a9e2-b13b-44be-bfb3-68c6590e81b4" (UID: "4db0a9e2-b13b-44be-bfb3-68c6590e81b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.202395 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4db0a9e2-b13b-44be-bfb3-68c6590e81b4" (UID: "4db0a9e2-b13b-44be-bfb3-68c6590e81b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.204022 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4db0a9e2-b13b-44be-bfb3-68c6590e81b4" (UID: "4db0a9e2-b13b-44be-bfb3-68c6590e81b4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.250139 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.250969 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.250988 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.251001 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.251013 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf9nl\" (UniqueName: \"kubernetes.io/projected/4db0a9e2-b13b-44be-bfb3-68c6590e81b4-kube-api-access-zf9nl\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.506743 4921 generic.go:334] "Generic (PLEG): container finished" podID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerID="c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58" exitCode=0 Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.506787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" event={"ID":"4db0a9e2-b13b-44be-bfb3-68c6590e81b4","Type":"ContainerDied","Data":"c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58"} Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 
13:46:17.506815 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" event={"ID":"4db0a9e2-b13b-44be-bfb3-68c6590e81b4","Type":"ContainerDied","Data":"1a7b488aa60f0ed77354f0403ca06c545fdac7a1b71ad4657e82932c4ad1ec2c"} Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.506831 4921 scope.go:117] "RemoveContainer" containerID="c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.506967 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dc7db885-nvscx" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.535205 4921 scope.go:117] "RemoveContainer" containerID="76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.542813 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-nvscx"] Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.552705 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69dc7db885-nvscx"] Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.562778 4921 scope.go:117] "RemoveContainer" containerID="c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58" Mar 18 13:46:17 crc kubenswrapper[4921]: E0318 13:46:17.563433 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58\": container with ID starting with c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58 not found: ID does not exist" containerID="c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.563466 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58"} err="failed to get container status \"c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58\": rpc error: code = NotFound desc = could not find container \"c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58\": container with ID starting with c506aa5c6dc360f92dc0508205baa658f3fffbd5a70fa115df89ec081fe69a58 not found: ID does not exist" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.563489 4921 scope.go:117] "RemoveContainer" containerID="76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7" Mar 18 13:46:17 crc kubenswrapper[4921]: E0318 13:46:17.564651 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7\": container with ID starting with 76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7 not found: ID does not exist" containerID="76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7" Mar 18 13:46:17 crc kubenswrapper[4921]: I0318 13:46:17.564677 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7"} err="failed to get container status \"76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7\": rpc error: code = NotFound desc = could not find container \"76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7\": container with ID starting with 76a1cbec88e5fa235231903bed38012d68b57e710c15c57118816c54becaaeb7 not found: ID does not exist" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.089448 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-fsljz"] Mar 18 13:46:19 crc kubenswrapper[4921]: E0318 13:46:19.089963 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerName="init" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.089983 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerName="init" Mar 18 13:46:19 crc kubenswrapper[4921]: E0318 13:46:19.090015 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerName="dnsmasq-dns" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.090024 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerName="dnsmasq-dns" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.090263 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" containerName="dnsmasq-dns" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.091036 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.099983 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-fsljz"] Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.188939 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3c32-account-create-update-dx6t7"] Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.190580 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.192535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.197816 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3c32-account-create-update-dx6t7"] Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.223247 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db0a9e2-b13b-44be-bfb3-68c6590e81b4" path="/var/lib/kubelet/pods/4db0a9e2-b13b-44be-bfb3-68c6590e81b4/volumes" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.289004 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skd6d\" (UniqueName: \"kubernetes.io/projected/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-kube-api-access-skd6d\") pod \"cinder-db-create-fsljz\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.289151 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmhv\" (UniqueName: \"kubernetes.io/projected/bc696a06-c0a8-4f71-814c-49242aab9b67-kube-api-access-twmhv\") pod \"cinder-3c32-account-create-update-dx6t7\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.289197 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc696a06-c0a8-4f71-814c-49242aab9b67-operator-scripts\") pod \"cinder-3c32-account-create-update-dx6t7\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: 
I0318 13:46:19.289293 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-operator-scripts\") pod \"cinder-db-create-fsljz\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.390682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc696a06-c0a8-4f71-814c-49242aab9b67-operator-scripts\") pod \"cinder-3c32-account-create-update-dx6t7\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.390818 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-operator-scripts\") pod \"cinder-db-create-fsljz\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.390867 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skd6d\" (UniqueName: \"kubernetes.io/projected/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-kube-api-access-skd6d\") pod \"cinder-db-create-fsljz\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.390925 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twmhv\" (UniqueName: \"kubernetes.io/projected/bc696a06-c0a8-4f71-814c-49242aab9b67-kube-api-access-twmhv\") pod \"cinder-3c32-account-create-update-dx6t7\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc 
kubenswrapper[4921]: I0318 13:46:19.391507 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc696a06-c0a8-4f71-814c-49242aab9b67-operator-scripts\") pod \"cinder-3c32-account-create-update-dx6t7\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.391922 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-operator-scripts\") pod \"cinder-db-create-fsljz\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.410777 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmhv\" (UniqueName: \"kubernetes.io/projected/bc696a06-c0a8-4f71-814c-49242aab9b67-kube-api-access-twmhv\") pod \"cinder-3c32-account-create-update-dx6t7\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.413267 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skd6d\" (UniqueName: \"kubernetes.io/projected/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-kube-api-access-skd6d\") pod \"cinder-db-create-fsljz\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.415301 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.508635 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:19 crc kubenswrapper[4921]: I0318 13:46:19.869366 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-fsljz"] Mar 18 13:46:19 crc kubenswrapper[4921]: W0318 13:46:19.871669 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3ff25_4ac8_4b4a_b608_a3904b2d4d8e.slice/crio-371423c5d2661ff58fbfa67a68e651ddcbd7459b9dba4023f4cfa89f78ad79b0 WatchSource:0}: Error finding container 371423c5d2661ff58fbfa67a68e651ddcbd7459b9dba4023f4cfa89f78ad79b0: Status 404 returned error can't find the container with id 371423c5d2661ff58fbfa67a68e651ddcbd7459b9dba4023f4cfa89f78ad79b0 Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.001222 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3c32-account-create-update-dx6t7"] Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.548199 4921 generic.go:334] "Generic (PLEG): container finished" podID="4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" containerID="3a84fa8eb23e5cf70baeb5004445d14940f01c62f37ac80017e0d26cb977986f" exitCode=0 Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.548254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-fsljz" event={"ID":"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e","Type":"ContainerDied","Data":"3a84fa8eb23e5cf70baeb5004445d14940f01c62f37ac80017e0d26cb977986f"} Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.549720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-fsljz" event={"ID":"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e","Type":"ContainerStarted","Data":"371423c5d2661ff58fbfa67a68e651ddcbd7459b9dba4023f4cfa89f78ad79b0"} Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.552491 4921 generic.go:334] "Generic (PLEG): container finished" podID="bc696a06-c0a8-4f71-814c-49242aab9b67" 
containerID="956b715c3d9db81be57c6842dd8e461a4552c74466ed489c5827da1000fee6b6" exitCode=0 Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.552542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3c32-account-create-update-dx6t7" event={"ID":"bc696a06-c0a8-4f71-814c-49242aab9b67","Type":"ContainerDied","Data":"956b715c3d9db81be57c6842dd8e461a4552c74466ed489c5827da1000fee6b6"} Mar 18 13:46:20 crc kubenswrapper[4921]: I0318 13:46:20.552589 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3c32-account-create-update-dx6t7" event={"ID":"bc696a06-c0a8-4f71-814c-49242aab9b67","Type":"ContainerStarted","Data":"8abd2cbc8deef214fffa905ebb60a11e2753edc43d267875887fff4fede0aa33"} Mar 18 13:46:21 crc kubenswrapper[4921]: I0318 13:46:21.993705 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:21 crc kubenswrapper[4921]: I0318 13:46:21.999738 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.136130 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-operator-scripts\") pod \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.136288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc696a06-c0a8-4f71-814c-49242aab9b67-operator-scripts\") pod \"bc696a06-c0a8-4f71-814c-49242aab9b67\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.136331 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twmhv\" (UniqueName: \"kubernetes.io/projected/bc696a06-c0a8-4f71-814c-49242aab9b67-kube-api-access-twmhv\") pod \"bc696a06-c0a8-4f71-814c-49242aab9b67\" (UID: \"bc696a06-c0a8-4f71-814c-49242aab9b67\") " Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.136418 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skd6d\" (UniqueName: \"kubernetes.io/projected/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-kube-api-access-skd6d\") pod \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\" (UID: \"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e\") " Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.137341 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc696a06-c0a8-4f71-814c-49242aab9b67-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc696a06-c0a8-4f71-814c-49242aab9b67" (UID: "bc696a06-c0a8-4f71-814c-49242aab9b67"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.138356 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" (UID: "4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.144785 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc696a06-c0a8-4f71-814c-49242aab9b67-kube-api-access-twmhv" (OuterVolumeSpecName: "kube-api-access-twmhv") pod "bc696a06-c0a8-4f71-814c-49242aab9b67" (UID: "bc696a06-c0a8-4f71-814c-49242aab9b67"). InnerVolumeSpecName "kube-api-access-twmhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.144899 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-kube-api-access-skd6d" (OuterVolumeSpecName: "kube-api-access-skd6d") pod "4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" (UID: "4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e"). InnerVolumeSpecName "kube-api-access-skd6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.238556 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.238608 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc696a06-c0a8-4f71-814c-49242aab9b67-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.238622 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twmhv\" (UniqueName: \"kubernetes.io/projected/bc696a06-c0a8-4f71-814c-49242aab9b67-kube-api-access-twmhv\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.238638 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skd6d\" (UniqueName: \"kubernetes.io/projected/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e-kube-api-access-skd6d\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.572813 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3c32-account-create-update-dx6t7" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.572811 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3c32-account-create-update-dx6t7" event={"ID":"bc696a06-c0a8-4f71-814c-49242aab9b67","Type":"ContainerDied","Data":"8abd2cbc8deef214fffa905ebb60a11e2753edc43d267875887fff4fede0aa33"} Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.573278 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8abd2cbc8deef214fffa905ebb60a11e2753edc43d267875887fff4fede0aa33" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.574678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-fsljz" event={"ID":"4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e","Type":"ContainerDied","Data":"371423c5d2661ff58fbfa67a68e651ddcbd7459b9dba4023f4cfa89f78ad79b0"} Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.574713 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="371423c5d2661ff58fbfa67a68e651ddcbd7459b9dba4023f4cfa89f78ad79b0" Mar 18 13:46:22 crc kubenswrapper[4921]: I0318 13:46:22.574777 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-fsljz" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.032520 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.085570 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.266043 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8tdrt"] Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.469985 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-hv9mx"] Mar 18 13:46:24 crc kubenswrapper[4921]: E0318 13:46:24.470451 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" containerName="mariadb-database-create" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.470472 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" containerName="mariadb-database-create" Mar 18 13:46:24 crc kubenswrapper[4921]: E0318 13:46:24.470493 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc696a06-c0a8-4f71-814c-49242aab9b67" containerName="mariadb-account-create-update" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.470500 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc696a06-c0a8-4f71-814c-49242aab9b67" containerName="mariadb-account-create-update" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.470698 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" containerName="mariadb-database-create" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.470721 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc696a06-c0a8-4f71-814c-49242aab9b67" 
containerName="mariadb-account-create-update" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.471350 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.478826 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.479031 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kgzh4" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.479265 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.488218 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hv9mx"] Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.590166 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-db-sync-config-data\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.590236 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-config-data\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.590267 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r22rg\" (UniqueName: \"kubernetes.io/projected/438dbf8f-1745-44e0-aadc-b8a6d07598cb-kube-api-access-r22rg\") pod \"cinder-db-sync-hv9mx\" (UID: 
\"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.590331 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-combined-ca-bundle\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.590354 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/438dbf8f-1745-44e0-aadc-b8a6d07598cb-etc-machine-id\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.590378 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-scripts\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.692011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-db-sync-config-data\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.692090 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-config-data\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 
13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.692127 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r22rg\" (UniqueName: \"kubernetes.io/projected/438dbf8f-1745-44e0-aadc-b8a6d07598cb-kube-api-access-r22rg\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.692183 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-combined-ca-bundle\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.692201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/438dbf8f-1745-44e0-aadc-b8a6d07598cb-etc-machine-id\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.692224 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-scripts\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.693409 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/438dbf8f-1745-44e0-aadc-b8a6d07598cb-etc-machine-id\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.697888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-scripts\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.697970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-combined-ca-bundle\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.698323 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-db-sync-config-data\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.704137 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-config-data\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.709164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r22rg\" (UniqueName: \"kubernetes.io/projected/438dbf8f-1745-44e0-aadc-b8a6d07598cb-kube-api-access-r22rg\") pod \"cinder-db-sync-hv9mx\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:24 crc kubenswrapper[4921]: I0318 13:46:24.799640 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:25 crc kubenswrapper[4921]: I0318 13:46:25.256929 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hv9mx"] Mar 18 13:46:25 crc kubenswrapper[4921]: W0318 13:46:25.260797 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438dbf8f_1745_44e0_aadc_b8a6d07598cb.slice/crio-de968dca083c9f80ed05b1cef40ae323a2585e6147b72a9ef49db04ab332c62b WatchSource:0}: Error finding container de968dca083c9f80ed05b1cef40ae323a2585e6147b72a9ef49db04ab332c62b: Status 404 returned error can't find the container with id de968dca083c9f80ed05b1cef40ae323a2585e6147b72a9ef49db04ab332c62b Mar 18 13:46:25 crc kubenswrapper[4921]: I0318 13:46:25.600136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hv9mx" event={"ID":"438dbf8f-1745-44e0-aadc-b8a6d07598cb","Type":"ContainerStarted","Data":"de968dca083c9f80ed05b1cef40ae323a2585e6147b72a9ef49db04ab332c62b"} Mar 18 13:46:25 crc kubenswrapper[4921]: I0318 13:46:25.600314 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8tdrt" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="registry-server" containerID="cri-o://0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b" gracePeriod=2 Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.080884 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.232471 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-utilities\") pod \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.232548 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-catalog-content\") pod \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.232613 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxntr\" (UniqueName: \"kubernetes.io/projected/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-kube-api-access-kxntr\") pod \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\" (UID: \"b7f789c9-ba82-4790-8e5f-cd4a7afaa453\") " Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.233583 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-utilities" (OuterVolumeSpecName: "utilities") pod "b7f789c9-ba82-4790-8e5f-cd4a7afaa453" (UID: "b7f789c9-ba82-4790-8e5f-cd4a7afaa453"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.245626 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-kube-api-access-kxntr" (OuterVolumeSpecName: "kube-api-access-kxntr") pod "b7f789c9-ba82-4790-8e5f-cd4a7afaa453" (UID: "b7f789c9-ba82-4790-8e5f-cd4a7afaa453"). InnerVolumeSpecName "kube-api-access-kxntr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.338746 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.339131 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxntr\" (UniqueName: \"kubernetes.io/projected/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-kube-api-access-kxntr\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.391474 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7f789c9-ba82-4790-8e5f-cd4a7afaa453" (UID: "b7f789c9-ba82-4790-8e5f-cd4a7afaa453"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.440933 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f789c9-ba82-4790-8e5f-cd4a7afaa453-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.610597 4921 generic.go:334] "Generic (PLEG): container finished" podID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerID="0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b" exitCode=0 Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.610665 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8tdrt" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.610670 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerDied","Data":"0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b"} Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.610711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tdrt" event={"ID":"b7f789c9-ba82-4790-8e5f-cd4a7afaa453","Type":"ContainerDied","Data":"2a14c305d1179018d843b1b5f5375e707026708be18b4ad9b165db3eb38a582e"} Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.610726 4921 scope.go:117] "RemoveContainer" containerID="0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.614928 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hv9mx" event={"ID":"438dbf8f-1745-44e0-aadc-b8a6d07598cb","Type":"ContainerStarted","Data":"44bc585a50fed56ab23a48d188c2f506f1b04b37e336167779e738e2f343c302"} Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.642616 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-hv9mx" podStartSLOduration=2.642596182 podStartE2EDuration="2.642596182s" podCreationTimestamp="2026-03-18 13:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:26.638220347 +0000 UTC m=+5806.188141056" watchObservedRunningTime="2026-03-18 13:46:26.642596182 +0000 UTC m=+5806.192516821" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.664438 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8tdrt"] Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.666405 4921 
scope.go:117] "RemoveContainer" containerID="f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.675347 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8tdrt"] Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.690536 4921 scope.go:117] "RemoveContainer" containerID="1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.724460 4921 scope.go:117] "RemoveContainer" containerID="0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b" Mar 18 13:46:26 crc kubenswrapper[4921]: E0318 13:46:26.725343 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b\": container with ID starting with 0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b not found: ID does not exist" containerID="0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.725385 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b"} err="failed to get container status \"0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b\": rpc error: code = NotFound desc = could not find container \"0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b\": container with ID starting with 0a51bb45dedde8f306d2e3d90546ae27804a78b9a8f00c667c03ee96a9146b3b not found: ID does not exist" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.725428 4921 scope.go:117] "RemoveContainer" containerID="f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096" Mar 18 13:46:26 crc kubenswrapper[4921]: E0318 13:46:26.726102 4921 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096\": container with ID starting with f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096 not found: ID does not exist" containerID="f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.726164 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096"} err="failed to get container status \"f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096\": rpc error: code = NotFound desc = could not find container \"f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096\": container with ID starting with f7191550ee0de31982b752986671318720af4c8774aecf7f9e0a0ddf431e9096 not found: ID does not exist" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.726201 4921 scope.go:117] "RemoveContainer" containerID="1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42" Mar 18 13:46:26 crc kubenswrapper[4921]: E0318 13:46:26.726550 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42\": container with ID starting with 1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42 not found: ID does not exist" containerID="1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42" Mar 18 13:46:26 crc kubenswrapper[4921]: I0318 13:46:26.726575 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42"} err="failed to get container status \"1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42\": rpc error: code = NotFound desc = could not find container 
\"1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42\": container with ID starting with 1e67f096b0ee2d9061c00195ec4fb7d41f7a0216d0d4a8971476087f34bb3a42 not found: ID does not exist" Mar 18 13:46:27 crc kubenswrapper[4921]: I0318 13:46:27.221577 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" path="/var/lib/kubelet/pods/b7f789c9-ba82-4790-8e5f-cd4a7afaa453/volumes" Mar 18 13:46:29 crc kubenswrapper[4921]: I0318 13:46:29.647249 4921 generic.go:334] "Generic (PLEG): container finished" podID="438dbf8f-1745-44e0-aadc-b8a6d07598cb" containerID="44bc585a50fed56ab23a48d188c2f506f1b04b37e336167779e738e2f343c302" exitCode=0 Mar 18 13:46:29 crc kubenswrapper[4921]: I0318 13:46:29.647322 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hv9mx" event={"ID":"438dbf8f-1745-44e0-aadc-b8a6d07598cb","Type":"ContainerDied","Data":"44bc585a50fed56ab23a48d188c2f506f1b04b37e336167779e738e2f343c302"} Mar 18 13:46:30 crc kubenswrapper[4921]: I0318 13:46:30.209503 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:46:30 crc kubenswrapper[4921]: E0318 13:46:30.209730 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.011425 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.134073 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-config-data\") pod \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.134166 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-db-sync-config-data\") pod \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.134221 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-scripts\") pod \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.134252 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r22rg\" (UniqueName: \"kubernetes.io/projected/438dbf8f-1745-44e0-aadc-b8a6d07598cb-kube-api-access-r22rg\") pod \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.134371 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/438dbf8f-1745-44e0-aadc-b8a6d07598cb-etc-machine-id\") pod \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.134436 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-combined-ca-bundle\") pod \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\" (UID: \"438dbf8f-1745-44e0-aadc-b8a6d07598cb\") " Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.135076 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/438dbf8f-1745-44e0-aadc-b8a6d07598cb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "438dbf8f-1745-44e0-aadc-b8a6d07598cb" (UID: "438dbf8f-1745-44e0-aadc-b8a6d07598cb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.140971 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438dbf8f-1745-44e0-aadc-b8a6d07598cb-kube-api-access-r22rg" (OuterVolumeSpecName: "kube-api-access-r22rg") pod "438dbf8f-1745-44e0-aadc-b8a6d07598cb" (UID: "438dbf8f-1745-44e0-aadc-b8a6d07598cb"). InnerVolumeSpecName "kube-api-access-r22rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.141223 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-scripts" (OuterVolumeSpecName: "scripts") pod "438dbf8f-1745-44e0-aadc-b8a6d07598cb" (UID: "438dbf8f-1745-44e0-aadc-b8a6d07598cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.141212 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "438dbf8f-1745-44e0-aadc-b8a6d07598cb" (UID: "438dbf8f-1745-44e0-aadc-b8a6d07598cb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.167163 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438dbf8f-1745-44e0-aadc-b8a6d07598cb" (UID: "438dbf8f-1745-44e0-aadc-b8a6d07598cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.182575 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-config-data" (OuterVolumeSpecName: "config-data") pod "438dbf8f-1745-44e0-aadc-b8a6d07598cb" (UID: "438dbf8f-1745-44e0-aadc-b8a6d07598cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.241261 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/438dbf8f-1745-44e0-aadc-b8a6d07598cb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.241310 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.241335 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.241350 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-db-sync-config-data\") on node \"crc\" DevicePath 
\"\"" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.241364 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/438dbf8f-1745-44e0-aadc-b8a6d07598cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.241379 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r22rg\" (UniqueName: \"kubernetes.io/projected/438dbf8f-1745-44e0-aadc-b8a6d07598cb-kube-api-access-r22rg\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.673228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hv9mx" event={"ID":"438dbf8f-1745-44e0-aadc-b8a6d07598cb","Type":"ContainerDied","Data":"de968dca083c9f80ed05b1cef40ae323a2585e6147b72a9ef49db04ab332c62b"} Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.673322 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de968dca083c9f80ed05b1cef40ae323a2585e6147b72a9ef49db04ab332c62b" Mar 18 13:46:31 crc kubenswrapper[4921]: I0318 13:46:31.673413 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-hv9mx" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.053908 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-6ptd4"] Mar 18 13:46:32 crc kubenswrapper[4921]: E0318 13:46:32.054424 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="registry-server" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.054443 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="registry-server" Mar 18 13:46:32 crc kubenswrapper[4921]: E0318 13:46:32.054453 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="extract-content" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.054461 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="extract-content" Mar 18 13:46:32 crc kubenswrapper[4921]: E0318 13:46:32.054484 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438dbf8f-1745-44e0-aadc-b8a6d07598cb" containerName="cinder-db-sync" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.054491 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="438dbf8f-1745-44e0-aadc-b8a6d07598cb" containerName="cinder-db-sync" Mar 18 13:46:32 crc kubenswrapper[4921]: E0318 13:46:32.054528 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="extract-utilities" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.054535 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" containerName="extract-utilities" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.054758 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7f789c9-ba82-4790-8e5f-cd4a7afaa453" 
containerName="registry-server" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.054785 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="438dbf8f-1745-44e0-aadc-b8a6d07598cb" containerName="cinder-db-sync" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.055951 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.067265 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-6ptd4"] Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.159278 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-nb\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.159642 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-dns-svc\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.159850 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-config\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.159915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lffg\" (UniqueName: 
\"kubernetes.io/projected/d95b901e-6bff-439c-b1fb-75ff8939262a-kube-api-access-8lffg\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.160146 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-sb\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.264605 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-dns-svc\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.262817 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-dns-svc\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.265043 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-config\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.269179 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lffg\" (UniqueName: 
\"kubernetes.io/projected/d95b901e-6bff-439c-b1fb-75ff8939262a-kube-api-access-8lffg\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.269341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-sb\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.269587 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-nb\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.270625 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-nb\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.271259 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-config\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.272078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.304827 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lffg\" (UniqueName: \"kubernetes.io/projected/d95b901e-6bff-439c-b1fb-75ff8939262a-kube-api-access-8lffg\") pod \"dnsmasq-dns-7784748f7f-6ptd4\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") " pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.357170 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.358648 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.361062 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.361557 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kgzh4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.362017 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.362199 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.390720 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.413568 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.474948 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.475190 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18d6390e-8783-44aa-8a07-9b522fb9a4d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.475312 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-scripts\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.475541 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.475839 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh5lh\" (UniqueName: \"kubernetes.io/projected/18d6390e-8783-44aa-8a07-9b522fb9a4d2-kube-api-access-jh5lh\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 
13:46:32.475989 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.476148 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18d6390e-8783-44aa-8a07-9b522fb9a4d2-logs\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.594613 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18d6390e-8783-44aa-8a07-9b522fb9a4d2-logs\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.595074 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.595139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18d6390e-8783-44aa-8a07-9b522fb9a4d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.595201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-scripts\") pod \"cinder-api-0\" (UID: 
\"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.595221 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.595267 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh5lh\" (UniqueName: \"kubernetes.io/projected/18d6390e-8783-44aa-8a07-9b522fb9a4d2-kube-api-access-jh5lh\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.595291 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.596413 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18d6390e-8783-44aa-8a07-9b522fb9a4d2-logs\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.598171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18d6390e-8783-44aa-8a07-9b522fb9a4d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.614997 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.616583 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.620540 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.624962 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-scripts\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.635346 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh5lh\" (UniqueName: \"kubernetes.io/projected/18d6390e-8783-44aa-8a07-9b522fb9a4d2-kube-api-access-jh5lh\") pod \"cinder-api-0\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " pod="openstack/cinder-api-0" Mar 18 13:46:32 crc kubenswrapper[4921]: I0318 13:46:32.706457 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:46:33 crc kubenswrapper[4921]: I0318 13:46:33.023033 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-6ptd4"] Mar 18 13:46:33 crc kubenswrapper[4921]: W0318 13:46:33.215200 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d6390e_8783_44aa_8a07_9b522fb9a4d2.slice/crio-f65ea92f0ad6fc0b968c5c13f3481ea5ed88179795e6b11c618ade68b40abf5b WatchSource:0}: Error finding container f65ea92f0ad6fc0b968c5c13f3481ea5ed88179795e6b11c618ade68b40abf5b: Status 404 returned error can't find the container with id f65ea92f0ad6fc0b968c5c13f3481ea5ed88179795e6b11c618ade68b40abf5b Mar 18 13:46:33 crc kubenswrapper[4921]: I0318 13:46:33.224098 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:46:33 crc kubenswrapper[4921]: I0318 13:46:33.698647 4921 generic.go:334] "Generic (PLEG): container finished" podID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerID="0b99892b1d3e3450c6803a159ccac3e7b8f18dd5d2e90fa9ca36890901e5b987" exitCode=0 Mar 18 13:46:33 crc kubenswrapper[4921]: I0318 13:46:33.699465 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" event={"ID":"d95b901e-6bff-439c-b1fb-75ff8939262a","Type":"ContainerDied","Data":"0b99892b1d3e3450c6803a159ccac3e7b8f18dd5d2e90fa9ca36890901e5b987"} Mar 18 13:46:33 crc kubenswrapper[4921]: I0318 13:46:33.699514 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" event={"ID":"d95b901e-6bff-439c-b1fb-75ff8939262a","Type":"ContainerStarted","Data":"c7199ce81394c61aabffac796c7687a9694f01de2597bd53c19d31670f293418"} Mar 18 13:46:33 crc kubenswrapper[4921]: I0318 13:46:33.703820 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"18d6390e-8783-44aa-8a07-9b522fb9a4d2","Type":"ContainerStarted","Data":"f65ea92f0ad6fc0b968c5c13f3481ea5ed88179795e6b11c618ade68b40abf5b"} Mar 18 13:46:34 crc kubenswrapper[4921]: I0318 13:46:34.720157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" event={"ID":"d95b901e-6bff-439c-b1fb-75ff8939262a","Type":"ContainerStarted","Data":"a18f118f91380fc2fea0bbb5c3cf426cc0271e62c033562955ecf8180e67d122"} Mar 18 13:46:34 crc kubenswrapper[4921]: I0318 13:46:34.720662 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:34 crc kubenswrapper[4921]: I0318 13:46:34.722050 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18d6390e-8783-44aa-8a07-9b522fb9a4d2","Type":"ContainerStarted","Data":"eced214ead4bcf784250d418e904c2456bbb41573533354b3ae6e720199dd2d0"} Mar 18 13:46:34 crc kubenswrapper[4921]: I0318 13:46:34.722095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18d6390e-8783-44aa-8a07-9b522fb9a4d2","Type":"ContainerStarted","Data":"7af8adaf6f105cace9e678f60dd91ed48a480a68408b9799e3ce3aa5ff9a366c"} Mar 18 13:46:34 crc kubenswrapper[4921]: I0318 13:46:34.753868 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" podStartSLOduration=3.753844596 podStartE2EDuration="3.753844596s" podCreationTimestamp="2026-03-18 13:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:34.751405066 +0000 UTC m=+5814.301325715" watchObservedRunningTime="2026-03-18 13:46:34.753844596 +0000 UTC m=+5814.303765235" Mar 18 13:46:34 crc kubenswrapper[4921]: I0318 13:46:34.804057 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=2.804035009 podStartE2EDuration="2.804035009s" podCreationTimestamp="2026-03-18 13:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:34.777041788 +0000 UTC m=+5814.326962427" watchObservedRunningTime="2026-03-18 13:46:34.804035009 +0000 UTC m=+5814.353955648" Mar 18 13:46:35 crc kubenswrapper[4921]: I0318 13:46:35.731591 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 13:46:42 crc kubenswrapper[4921]: I0318 13:46:42.415322 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:46:42 crc kubenswrapper[4921]: I0318 13:46:42.481415 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-4h28g"] Mar 18 13:46:42 crc kubenswrapper[4921]: I0318 13:46:42.481684 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerName="dnsmasq-dns" containerID="cri-o://da83f9de1dad7d2ceba32cf9d7667f1e82333087deec36e327b0dcb86ddda34e" gracePeriod=10 Mar 18 13:46:42 crc kubenswrapper[4921]: I0318 13:46:42.814758 4921 generic.go:334] "Generic (PLEG): container finished" podID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerID="da83f9de1dad7d2ceba32cf9d7667f1e82333087deec36e327b0dcb86ddda34e" exitCode=0 Mar 18 13:46:42 crc kubenswrapper[4921]: I0318 13:46:42.814813 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" event={"ID":"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4","Type":"ContainerDied","Data":"da83f9de1dad7d2ceba32cf9d7667f1e82333087deec36e327b0dcb86ddda34e"} Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.049344 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.131431 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-nb\") pod \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.131509 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-sb\") pod \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.131644 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-config\") pod \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.131676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-dns-svc\") pod \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.131711 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb2v9\" (UniqueName: \"kubernetes.io/projected/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-kube-api-access-qb2v9\") pod \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\" (UID: \"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4\") " Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.138362 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-kube-api-access-qb2v9" (OuterVolumeSpecName: "kube-api-access-qb2v9") pod "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" (UID: "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4"). InnerVolumeSpecName "kube-api-access-qb2v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.189357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" (UID: "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.200342 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" (UID: "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.202732 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-config" (OuterVolumeSpecName: "config") pod "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" (UID: "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.220184 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" (UID: "cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.233435 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.233465 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.233480 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb2v9\" (UniqueName: \"kubernetes.io/projected/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-kube-api-access-qb2v9\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.233491 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.233500 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.839138 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" event={"ID":"cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4","Type":"ContainerDied","Data":"60aadb52963d359b536c78f1e6577f29c59b70c16c5083407920ce3dba96b47a"} Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.839450 4921 scope.go:117] "RemoveContainer" containerID="da83f9de1dad7d2ceba32cf9d7667f1e82333087deec36e327b0dcb86ddda34e" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.839586 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85c7886d8f-4h28g" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.881459 4921 scope.go:117] "RemoveContainer" containerID="229264ac90f3e80c1f2a0f0efecb0e68130a530ab30cd8cf780431a2a905e396" Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.882083 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-4h28g"] Mar 18 13:46:43 crc kubenswrapper[4921]: I0318 13:46:43.888925 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85c7886d8f-4h28g"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.589253 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.589508 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-log" containerID="cri-o://f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.589615 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-api" containerID="cri-o://d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.620139 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.620400 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" containerName="nova-scheduler-scheduler" containerID="cri-o://62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.638866 4921 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.639190 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="396d2df9-ddb7-4514-a6e2-991b6c410448" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.657711 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.658006 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0d7160e1-8c76-4e45-b7b9-68556e95db42" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.674704 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.675088 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-log" containerID="cri-o://965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.675103 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-metadata" containerID="cri-o://b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.686779 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.687016 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f55371a6-079f-4c8b-8460-b330cdc72ff6" containerName="nova-cell1-conductor-conductor" containerID="cri-o://bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b" gracePeriod=30 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.853088 4921 generic.go:334] "Generic (PLEG): container finished" podID="9916a791-4604-4652-9dd1-354d91186046" containerID="965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39" exitCode=143 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.853150 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9916a791-4604-4652-9dd1-354d91186046","Type":"ContainerDied","Data":"965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39"} Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.855558 4921 generic.go:334] "Generic (PLEG): container finished" podID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerID="f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606" exitCode=143 Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.855609 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c998b6ad-897c-4d19-9216-b6950058a3e2","Type":"ContainerDied","Data":"f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606"} Mar 18 13:46:44 crc kubenswrapper[4921]: I0318 13:46:44.911508 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.209246 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:46:45 crc kubenswrapper[4921]: E0318 13:46:45.209526 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.227126 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" path="/var/lib/kubelet/pods/cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4/volumes" Mar 18 13:46:45 crc kubenswrapper[4921]: E0318 13:46:45.623390 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:46:45 crc kubenswrapper[4921]: E0318 13:46:45.626057 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:46:45 crc kubenswrapper[4921]: E0318 13:46:45.627250 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:46:45 crc kubenswrapper[4921]: E0318 13:46:45.627286 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: 
, stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" containerName="nova-scheduler-scheduler" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.763536 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.881201 4921 generic.go:334] "Generic (PLEG): container finished" podID="0d7160e1-8c76-4e45-b7b9-68556e95db42" containerID="8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889" exitCode=0 Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.881302 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.881307 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d7160e1-8c76-4e45-b7b9-68556e95db42","Type":"ContainerDied","Data":"8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889"} Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.881899 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d7160e1-8c76-4e45-b7b9-68556e95db42","Type":"ContainerDied","Data":"778a49e7bd88d4e0c715025b9e9ef536988b3ccf37cc166299492bb2a6c42011"} Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.881938 4921 scope.go:117] "RemoveContainer" containerID="8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.885074 4921 generic.go:334] "Generic (PLEG): container finished" podID="f55371a6-079f-4c8b-8460-b330cdc72ff6" containerID="bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b" exitCode=0 Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.885166 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"f55371a6-079f-4c8b-8460-b330cdc72ff6","Type":"ContainerDied","Data":"bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b"} Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.892576 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-combined-ca-bundle\") pod \"0d7160e1-8c76-4e45-b7b9-68556e95db42\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.892683 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrrzg\" (UniqueName: \"kubernetes.io/projected/0d7160e1-8c76-4e45-b7b9-68556e95db42-kube-api-access-jrrzg\") pod \"0d7160e1-8c76-4e45-b7b9-68556e95db42\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.892792 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-config-data\") pod \"0d7160e1-8c76-4e45-b7b9-68556e95db42\" (UID: \"0d7160e1-8c76-4e45-b7b9-68556e95db42\") " Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.902424 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7160e1-8c76-4e45-b7b9-68556e95db42-kube-api-access-jrrzg" (OuterVolumeSpecName: "kube-api-access-jrrzg") pod "0d7160e1-8c76-4e45-b7b9-68556e95db42" (UID: "0d7160e1-8c76-4e45-b7b9-68556e95db42"). InnerVolumeSpecName "kube-api-access-jrrzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.914272 4921 scope.go:117] "RemoveContainer" containerID="8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889" Mar 18 13:46:45 crc kubenswrapper[4921]: E0318 13:46:45.914937 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889\": container with ID starting with 8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889 not found: ID does not exist" containerID="8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.914972 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889"} err="failed to get container status \"8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889\": rpc error: code = NotFound desc = could not find container \"8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889\": container with ID starting with 8037281ffefb50df4f49eb54ec69eafbf7208e50465e1eccc121253cc02d5889 not found: ID does not exist" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.919560 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d7160e1-8c76-4e45-b7b9-68556e95db42" (UID: "0d7160e1-8c76-4e45-b7b9-68556e95db42"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.940671 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-config-data" (OuterVolumeSpecName: "config-data") pod "0d7160e1-8c76-4e45-b7b9-68556e95db42" (UID: "0d7160e1-8c76-4e45-b7b9-68556e95db42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.994802 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.994862 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7160e1-8c76-4e45-b7b9-68556e95db42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:45 crc kubenswrapper[4921]: I0318 13:46:45.994876 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrrzg\" (UniqueName: \"kubernetes.io/projected/0d7160e1-8c76-4e45-b7b9-68556e95db42-kube-api-access-jrrzg\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.222690 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.230141 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.255811 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b is running failed: container process not found" 
containerID="bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.256281 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b is running failed: container process not found" containerID="bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.256756 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b is running failed: container process not found" containerID="bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.256790 4921 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f55371a6-079f-4c8b-8460-b330cdc72ff6" containerName="nova-cell1-conductor-conductor" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.258832 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.259735 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7160e1-8c76-4e45-b7b9-68556e95db42" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.259865 4921 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0d7160e1-8c76-4e45-b7b9-68556e95db42" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.260009 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerName="init" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.260087 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerName="init" Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.260192 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerName="dnsmasq-dns" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.260264 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerName="dnsmasq-dns" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.260621 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7160e1-8c76-4e45-b7b9-68556e95db42" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.260770 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd97f08d-9b0d-4038-bdf2-d66c8b7d5be4" containerName="dnsmasq-dns" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.261848 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.269987 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.273026 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.353178 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.403792 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-combined-ca-bundle\") pod \"f55371a6-079f-4c8b-8460-b330cdc72ff6\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.403871 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnk76\" (UniqueName: \"kubernetes.io/projected/f55371a6-079f-4c8b-8460-b330cdc72ff6-kube-api-access-qnk76\") pod \"f55371a6-079f-4c8b-8460-b330cdc72ff6\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.404087 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-config-data\") pod \"f55371a6-079f-4c8b-8460-b330cdc72ff6\" (UID: \"f55371a6-079f-4c8b-8460-b330cdc72ff6\") " Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.404398 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcbff36-a111-4270-988d-1fbb923c2f47-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.404445 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8w69\" (UniqueName: \"kubernetes.io/projected/ebcbff36-a111-4270-988d-1fbb923c2f47-kube-api-access-t8w69\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 
13:46:46.404492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebcbff36-a111-4270-988d-1fbb923c2f47-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.408668 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55371a6-079f-4c8b-8460-b330cdc72ff6-kube-api-access-qnk76" (OuterVolumeSpecName: "kube-api-access-qnk76") pod "f55371a6-079f-4c8b-8460-b330cdc72ff6" (UID: "f55371a6-079f-4c8b-8460-b330cdc72ff6"). InnerVolumeSpecName "kube-api-access-qnk76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.438820 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f55371a6-079f-4c8b-8460-b330cdc72ff6" (UID: "f55371a6-079f-4c8b-8460-b330cdc72ff6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.439314 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-config-data" (OuterVolumeSpecName: "config-data") pod "f55371a6-079f-4c8b-8460-b330cdc72ff6" (UID: "f55371a6-079f-4c8b-8460-b330cdc72ff6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.505973 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebcbff36-a111-4270-988d-1fbb923c2f47-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.506158 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcbff36-a111-4270-988d-1fbb923c2f47-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.506214 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8w69\" (UniqueName: \"kubernetes.io/projected/ebcbff36-a111-4270-988d-1fbb923c2f47-kube-api-access-t8w69\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.506305 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.506323 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55371a6-079f-4c8b-8460-b330cdc72ff6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.506337 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnk76\" (UniqueName: \"kubernetes.io/projected/f55371a6-079f-4c8b-8460-b330cdc72ff6-kube-api-access-qnk76\") on node \"crc\" 
DevicePath \"\"" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.510072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcbff36-a111-4270-988d-1fbb923c2f47-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.510077 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebcbff36-a111-4270-988d-1fbb923c2f47-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.526206 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8w69\" (UniqueName: \"kubernetes.io/projected/ebcbff36-a111-4270-988d-1fbb923c2f47-kube-api-access-t8w69\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebcbff36-a111-4270-988d-1fbb923c2f47\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.642297 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.899683 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f55371a6-079f-4c8b-8460-b330cdc72ff6","Type":"ContainerDied","Data":"be016592d57e77d23cc34c8625a049106cce3325f9edce4c2261a7c297b925ba"} Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.900006 4921 scope.go:117] "RemoveContainer" containerID="bcce8509a0fb956eeb1b4f6eb823373180346eecafcd27b2eaf60d7ac3f45a8b" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.899943 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.953128 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.971074 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.982398 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:46:46 crc kubenswrapper[4921]: E0318 13:46:46.982831 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55371a6-079f-4c8b-8460-b330cdc72ff6" containerName="nova-cell1-conductor-conductor" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.982851 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55371a6-079f-4c8b-8460-b330cdc72ff6" containerName="nova-cell1-conductor-conductor" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.983022 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55371a6-079f-4c8b-8460-b330cdc72ff6" containerName="nova-cell1-conductor-conductor" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.983703 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:46 crc kubenswrapper[4921]: I0318 13:46:46.986299 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.001061 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:46:47 crc kubenswrapper[4921]: E0318 13:46:47.020638 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 13:46:47 crc kubenswrapper[4921]: E0318 13:46:47.022051 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 13:46:47 crc kubenswrapper[4921]: E0318 13:46:47.024123 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 13:46:47 crc kubenswrapper[4921]: E0318 13:46:47.024228 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="396d2df9-ddb7-4514-a6e2-991b6c410448" containerName="nova-cell0-conductor-conductor" Mar 18 13:46:47 crc 
kubenswrapper[4921]: I0318 13:46:47.117684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f778c5-fcbe-4592-a869-e6ded2907395-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.117767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs2dh\" (UniqueName: \"kubernetes.io/projected/a0f778c5-fcbe-4592-a869-e6ded2907395-kube-api-access-fs2dh\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.117819 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f778c5-fcbe-4592-a869-e6ded2907395-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.128764 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.223488 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f778c5-fcbe-4592-a869-e6ded2907395-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.223763 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs2dh\" (UniqueName: \"kubernetes.io/projected/a0f778c5-fcbe-4592-a869-e6ded2907395-kube-api-access-fs2dh\") pod 
\"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.223819 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f778c5-fcbe-4592-a869-e6ded2907395-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.228215 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7160e1-8c76-4e45-b7b9-68556e95db42" path="/var/lib/kubelet/pods/0d7160e1-8c76-4e45-b7b9-68556e95db42/volumes" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.230194 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55371a6-079f-4c8b-8460-b330cdc72ff6" path="/var/lib/kubelet/pods/f55371a6-079f-4c8b-8460-b330cdc72ff6/volumes" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.234264 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f778c5-fcbe-4592-a869-e6ded2907395-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.236526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f778c5-fcbe-4592-a869-e6ded2907395-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.246206 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs2dh\" (UniqueName: \"kubernetes.io/projected/a0f778c5-fcbe-4592-a869-e6ded2907395-kube-api-access-fs2dh\") pod 
\"nova-cell1-conductor-0\" (UID: \"a0f778c5-fcbe-4592-a869-e6ded2907395\") " pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.302351 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.803342 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.946077 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ebcbff36-a111-4270-988d-1fbb923c2f47","Type":"ContainerStarted","Data":"4623acd8e660dd204eaff8f331bf090ab56b01ca7639aa441f15f5326418d02f"} Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.946146 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ebcbff36-a111-4270-988d-1fbb923c2f47","Type":"ContainerStarted","Data":"6ce4a3413b4fcfb4ddb37f1b71b6c7855770f397a9bddeb5b89b08eff6065124"} Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.955913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a0f778c5-fcbe-4592-a869-e6ded2907395","Type":"ContainerStarted","Data":"7a9f37709b806917f36685797b1523e3d645a13d413e75852711f7726227d4b1"} Mar 18 13:46:47 crc kubenswrapper[4921]: I0318 13:46:47.972184 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.9721643279999999 podStartE2EDuration="1.972164328s" podCreationTimestamp="2026-03-18 13:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:47.967403472 +0000 UTC m=+5827.517324111" watchObservedRunningTime="2026-03-18 13:46:47.972164328 +0000 UTC m=+5827.522084967" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 
13:46:48.483980 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.554978 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-combined-ca-bundle\") pod \"c998b6ad-897c-4d19-9216-b6950058a3e2\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.555034 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pm8q\" (UniqueName: \"kubernetes.io/projected/c998b6ad-897c-4d19-9216-b6950058a3e2-kube-api-access-8pm8q\") pod \"c998b6ad-897c-4d19-9216-b6950058a3e2\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.555318 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c998b6ad-897c-4d19-9216-b6950058a3e2-logs\") pod \"c998b6ad-897c-4d19-9216-b6950058a3e2\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.555353 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-config-data\") pod \"c998b6ad-897c-4d19-9216-b6950058a3e2\" (UID: \"c998b6ad-897c-4d19-9216-b6950058a3e2\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.557167 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c998b6ad-897c-4d19-9216-b6950058a3e2-logs" (OuterVolumeSpecName: "logs") pod "c998b6ad-897c-4d19-9216-b6950058a3e2" (UID: "c998b6ad-897c-4d19-9216-b6950058a3e2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.569821 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c998b6ad-897c-4d19-9216-b6950058a3e2-kube-api-access-8pm8q" (OuterVolumeSpecName: "kube-api-access-8pm8q") pod "c998b6ad-897c-4d19-9216-b6950058a3e2" (UID: "c998b6ad-897c-4d19-9216-b6950058a3e2"). InnerVolumeSpecName "kube-api-access-8pm8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.611263 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-config-data" (OuterVolumeSpecName: "config-data") pod "c998b6ad-897c-4d19-9216-b6950058a3e2" (UID: "c998b6ad-897c-4d19-9216-b6950058a3e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.611656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c998b6ad-897c-4d19-9216-b6950058a3e2" (UID: "c998b6ad-897c-4d19-9216-b6950058a3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.642420 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.658422 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.658485 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c998b6ad-897c-4d19-9216-b6950058a3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.658505 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pm8q\" (UniqueName: \"kubernetes.io/projected/c998b6ad-897c-4d19-9216-b6950058a3e2-kube-api-access-8pm8q\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.658519 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c998b6ad-897c-4d19-9216-b6950058a3e2-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.759969 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916a791-4604-4652-9dd1-354d91186046-logs\") pod \"9916a791-4604-4652-9dd1-354d91186046\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.760039 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-combined-ca-bundle\") pod \"9916a791-4604-4652-9dd1-354d91186046\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.760100 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-config-data\") pod \"9916a791-4604-4652-9dd1-354d91186046\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.760602 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4588\" (UniqueName: \"kubernetes.io/projected/9916a791-4604-4652-9dd1-354d91186046-kube-api-access-t4588\") pod \"9916a791-4604-4652-9dd1-354d91186046\" (UID: \"9916a791-4604-4652-9dd1-354d91186046\") " Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.761153 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9916a791-4604-4652-9dd1-354d91186046-logs" (OuterVolumeSpecName: "logs") pod "9916a791-4604-4652-9dd1-354d91186046" (UID: "9916a791-4604-4652-9dd1-354d91186046"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.777207 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9916a791-4604-4652-9dd1-354d91186046-kube-api-access-t4588" (OuterVolumeSpecName: "kube-api-access-t4588") pod "9916a791-4604-4652-9dd1-354d91186046" (UID: "9916a791-4604-4652-9dd1-354d91186046"). InnerVolumeSpecName "kube-api-access-t4588". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.799330 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9916a791-4604-4652-9dd1-354d91186046" (UID: "9916a791-4604-4652-9dd1-354d91186046"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.820262 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-config-data" (OuterVolumeSpecName: "config-data") pod "9916a791-4604-4652-9dd1-354d91186046" (UID: "9916a791-4604-4652-9dd1-354d91186046"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.863019 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916a791-4604-4652-9dd1-354d91186046-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.863059 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.863070 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916a791-4604-4652-9dd1-354d91186046-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.863079 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4588\" (UniqueName: \"kubernetes.io/projected/9916a791-4604-4652-9dd1-354d91186046-kube-api-access-t4588\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.974588 4921 generic.go:334] "Generic (PLEG): container finished" podID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerID="d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a" exitCode=0 Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.974701 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c998b6ad-897c-4d19-9216-b6950058a3e2","Type":"ContainerDied","Data":"d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a"} Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.974737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c998b6ad-897c-4d19-9216-b6950058a3e2","Type":"ContainerDied","Data":"302945e096aa37fe658c4346c88e57a28400a8a391c90bfce2fb42af9fca7f54"} Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.974778 4921 scope.go:117] "RemoveContainer" containerID="d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.974950 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.988825 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a0f778c5-fcbe-4592-a869-e6ded2907395","Type":"ContainerStarted","Data":"fbd926359a75d014b30b018014e8f1b646c29d556232a211db4a3aacf4fe3d90"} Mar 18 13:46:48 crc kubenswrapper[4921]: I0318 13:46:48.989423 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.000020 4921 generic.go:334] "Generic (PLEG): container finished" podID="9916a791-4604-4652-9dd1-354d91186046" containerID="b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828" exitCode=0 Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.000126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9916a791-4604-4652-9dd1-354d91186046","Type":"ContainerDied","Data":"b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828"} Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.000178 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9916a791-4604-4652-9dd1-354d91186046","Type":"ContainerDied","Data":"70257accb71cb34e4fbe6a43b734515ac3ba0b4c320c4e2a4ae3e06768e4fe98"} Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.002700 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.009641 4921 scope.go:117] "RemoveContainer" containerID="f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.014797 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.014775593 podStartE2EDuration="3.014775593s" podCreationTimestamp="2026-03-18 13:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:49.007412393 +0000 UTC m=+5828.557333042" watchObservedRunningTime="2026-03-18 13:46:49.014775593 +0000 UTC m=+5828.564696242" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.076924 4921 scope.go:117] "RemoveContainer" containerID="d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.077897 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a\": container with ID starting with d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a not found: ID does not exist" containerID="d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.077959 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a"} err="failed to get container status 
\"d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a\": rpc error: code = NotFound desc = could not find container \"d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a\": container with ID starting with d4b61667534b340bb248bb5ef7f828d60866429f5a8a0c4f9946d097ad10359a not found: ID does not exist" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.077989 4921 scope.go:117] "RemoveContainer" containerID="f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.078786 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606\": container with ID starting with f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606 not found: ID does not exist" containerID="f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.078947 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606"} err="failed to get container status \"f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606\": rpc error: code = NotFound desc = could not find container \"f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606\": container with ID starting with f4ddc8f768f3fe1db8e8c9e3889472d0a17d478a11445b4f8be43b60c295c606 not found: ID does not exist" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.078985 4921 scope.go:117] "RemoveContainer" containerID="b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.093342 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.129876 4921 scope.go:117] "RemoveContainer" 
containerID="965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.130046 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.149211 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.149991 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-metadata" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150006 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-metadata" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.150035 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-log" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150041 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-log" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.150066 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-log" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150072 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-log" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.150101 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-api" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150119 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-api" Mar 18 13:46:49 crc 
kubenswrapper[4921]: I0318 13:46:49.150928 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-metadata" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150956 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-log" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150965 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" containerName="nova-api-api" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.150986 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9916a791-4604-4652-9dd1-354d91186046" containerName="nova-metadata-log" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.153005 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.171348 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.172006 4921 scope.go:117] "RemoveContainer" containerID="b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.172979 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828\": container with ID starting with b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828 not found: ID does not exist" containerID="b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.173014 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828"} err="failed to get container status \"b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828\": rpc error: code = NotFound desc = could not find container \"b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828\": container with ID starting with b026b6446f2bb70422ad38101233ab180fd6172b7af3a104ac2f459879a36828 not found: ID does not exist" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.173039 4921 scope.go:117] "RemoveContainer" containerID="965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39" Mar 18 13:46:49 crc kubenswrapper[4921]: E0318 13:46:49.173777 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39\": container with ID starting with 965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39 not found: ID does not exist" containerID="965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.173844 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39"} err="failed to get container status \"965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39\": rpc error: code = NotFound desc = could not find container \"965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39\": container with ID starting with 965b0611d2cba9120f35fdd6eae29e6e842c26c954e6f92e8e4301a13373de39 not found: ID does not exist" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.189640 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.206908 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.223772 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9916a791-4604-4652-9dd1-354d91186046" path="/var/lib/kubelet/pods/9916a791-4604-4652-9dd1-354d91186046/volumes" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.225004 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c998b6ad-897c-4d19-9216-b6950058a3e2" path="/var/lib/kubelet/pods/c998b6ad-897c-4d19-9216-b6950058a3e2/volumes" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.225928 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.231841 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.233976 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.236683 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.246879 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.276911 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec734ce5-0367-4e51-9875-3390097b2ebc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.277027 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec734ce5-0367-4e51-9875-3390097b2ebc-config-data\") pod \"nova-api-0\" (UID: 
\"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.277159 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg64\" (UniqueName: \"kubernetes.io/projected/ec734ce5-0367-4e51-9875-3390097b2ebc-kube-api-access-rqg64\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.277223 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec734ce5-0367-4e51-9875-3390097b2ebc-logs\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.379944 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec734ce5-0367-4e51-9875-3390097b2ebc-config-data\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380041 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-config-data\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380335 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg64\" (UniqueName: \"kubernetes.io/projected/ec734ce5-0367-4e51-9875-3390097b2ebc-kube-api-access-rqg64\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380517 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec734ce5-0367-4e51-9875-3390097b2ebc-logs\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380650 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380683 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf6vp\" (UniqueName: \"kubernetes.io/projected/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-kube-api-access-bf6vp\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380711 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-logs\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380789 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec734ce5-0367-4e51-9875-3390097b2ebc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.380949 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec734ce5-0367-4e51-9875-3390097b2ebc-logs\") pod \"nova-api-0\" (UID: 
\"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.385474 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec734ce5-0367-4e51-9875-3390097b2ebc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.392237 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec734ce5-0367-4e51-9875-3390097b2ebc-config-data\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.409901 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg64\" (UniqueName: \"kubernetes.io/projected/ec734ce5-0367-4e51-9875-3390097b2ebc-kube-api-access-rqg64\") pod \"nova-api-0\" (UID: \"ec734ce5-0367-4e51-9875-3390097b2ebc\") " pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.482202 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-config-data\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.482640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.482670 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bf6vp\" (UniqueName: \"kubernetes.io/projected/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-kube-api-access-bf6vp\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.482695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-logs\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.483282 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-logs\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.487278 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.487389 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-config-data\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.500652 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.501091 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf6vp\" (UniqueName: \"kubernetes.io/projected/e1e6617b-aaa3-4d0f-81cb-1149d2be62a6-kube-api-access-bf6vp\") pod \"nova-metadata-0\" (UID: \"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6\") " pod="openstack/nova-metadata-0" Mar 18 13:46:49 crc kubenswrapper[4921]: I0318 13:46:49.554008 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 13:46:50 crc kubenswrapper[4921]: I0318 13:46:50.045398 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 13:46:50 crc kubenswrapper[4921]: W0318 13:46:50.053591 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec734ce5_0367_4e51_9875_3390097b2ebc.slice/crio-8c01a2cabf63b8eea4c4ea79db5ae776eff6325c1b1cc40993bb168b8cf27b8a WatchSource:0}: Error finding container 8c01a2cabf63b8eea4c4ea79db5ae776eff6325c1b1cc40993bb168b8cf27b8a: Status 404 returned error can't find the container with id 8c01a2cabf63b8eea4c4ea79db5ae776eff6325c1b1cc40993bb168b8cf27b8a Mar 18 13:46:50 crc kubenswrapper[4921]: I0318 13:46:50.183064 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 13:46:50 crc kubenswrapper[4921]: W0318 13:46:50.191228 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e6617b_aaa3_4d0f_81cb_1149d2be62a6.slice/crio-bfba4780a44a9533567cd7e8a651e8dcfdadb562b06e81c4fc609f36cbe305bc WatchSource:0}: Error finding container bfba4780a44a9533567cd7e8a651e8dcfdadb562b06e81c4fc609f36cbe305bc: Status 404 returned error can't find the container with id bfba4780a44a9533567cd7e8a651e8dcfdadb562b06e81c4fc609f36cbe305bc Mar 18 13:46:50 crc 
kubenswrapper[4921]: E0318 13:46:50.632805 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:46:50 crc kubenswrapper[4921]: E0318 13:46:50.634956 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:46:50 crc kubenswrapper[4921]: E0318 13:46:50.639603 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 13:46:50 crc kubenswrapper[4921]: E0318 13:46:50.639652 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" containerName="nova-scheduler-scheduler" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.033607 4921 generic.go:334] "Generic (PLEG): container finished" podID="396d2df9-ddb7-4514-a6e2-991b6c410448" containerID="9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6" exitCode=0 Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.033711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"396d2df9-ddb7-4514-a6e2-991b6c410448","Type":"ContainerDied","Data":"9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.037599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6","Type":"ContainerStarted","Data":"877dfcec8098b1d62ad9ee435b3f603799f25b2be513080052e6d0434326c121"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.037837 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6","Type":"ContainerStarted","Data":"908672d0f0e65199fcf92a81fc3bc3a4d172a5161633f8476e0cb2bbb7c74829"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.037849 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e6617b-aaa3-4d0f-81cb-1149d2be62a6","Type":"ContainerStarted","Data":"bfba4780a44a9533567cd7e8a651e8dcfdadb562b06e81c4fc609f36cbe305bc"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.040587 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec734ce5-0367-4e51-9875-3390097b2ebc","Type":"ContainerStarted","Data":"f2eb7ccf176555ceb95ccd25da3e9a8d6ece7453b4676c7db89c312cf8c8cdba"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.040616 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec734ce5-0367-4e51-9875-3390097b2ebc","Type":"ContainerStarted","Data":"3548ed043d1e41440e43eeab5966cda8a686a7ceb584ea1ccd9f88e8a40fe782"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.040627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec734ce5-0367-4e51-9875-3390097b2ebc","Type":"ContainerStarted","Data":"8c01a2cabf63b8eea4c4ea79db5ae776eff6325c1b1cc40993bb168b8cf27b8a"} Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 
13:46:51.076440 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.076422149 podStartE2EDuration="2.076422149s" podCreationTimestamp="2026-03-18 13:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:51.071316604 +0000 UTC m=+5830.621237273" watchObservedRunningTime="2026-03-18 13:46:51.076422149 +0000 UTC m=+5830.626342788" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.100563 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.100543938 podStartE2EDuration="2.100543938s" podCreationTimestamp="2026-03-18 13:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:51.093452856 +0000 UTC m=+5830.643373495" watchObservedRunningTime="2026-03-18 13:46:51.100543938 +0000 UTC m=+5830.650464577" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.221840 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.314668 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-combined-ca-bundle\") pod \"396d2df9-ddb7-4514-a6e2-991b6c410448\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.314728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4g9t\" (UniqueName: \"kubernetes.io/projected/396d2df9-ddb7-4514-a6e2-991b6c410448-kube-api-access-p4g9t\") pod \"396d2df9-ddb7-4514-a6e2-991b6c410448\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.314807 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-config-data\") pod \"396d2df9-ddb7-4514-a6e2-991b6c410448\" (UID: \"396d2df9-ddb7-4514-a6e2-991b6c410448\") " Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.319485 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396d2df9-ddb7-4514-a6e2-991b6c410448-kube-api-access-p4g9t" (OuterVolumeSpecName: "kube-api-access-p4g9t") pod "396d2df9-ddb7-4514-a6e2-991b6c410448" (UID: "396d2df9-ddb7-4514-a6e2-991b6c410448"). InnerVolumeSpecName "kube-api-access-p4g9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.345971 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "396d2df9-ddb7-4514-a6e2-991b6c410448" (UID: "396d2df9-ddb7-4514-a6e2-991b6c410448"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.360517 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-config-data" (OuterVolumeSpecName: "config-data") pod "396d2df9-ddb7-4514-a6e2-991b6c410448" (UID: "396d2df9-ddb7-4514-a6e2-991b6c410448"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.417397 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.417448 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4g9t\" (UniqueName: \"kubernetes.io/projected/396d2df9-ddb7-4514-a6e2-991b6c410448-kube-api-access-p4g9t\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.417463 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396d2df9-ddb7-4514-a6e2-991b6c410448-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:51 crc kubenswrapper[4921]: I0318 13:46:51.642686 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.055781 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"396d2df9-ddb7-4514-a6e2-991b6c410448","Type":"ContainerDied","Data":"f9ed5bdd44066d7d635463570e006058304ce451ebed562ee6b5140afe4e3511"} Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.055877 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.056456 4921 scope.go:117] "RemoveContainer" containerID="9dc57a8fae41d9e5e854a7be713531f297c1e188146f336849243f288322c4c6" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.138093 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.154437 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.168076 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:46:52 crc kubenswrapper[4921]: E0318 13:46:52.168632 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396d2df9-ddb7-4514-a6e2-991b6c410448" containerName="nova-cell0-conductor-conductor" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.168647 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="396d2df9-ddb7-4514-a6e2-991b6c410448" containerName="nova-cell0-conductor-conductor" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.169084 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="396d2df9-ddb7-4514-a6e2-991b6c410448" containerName="nova-cell0-conductor-conductor" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.170189 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.179810 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.188834 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.239411 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.239457 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.239637 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mlwl\" (UniqueName: \"kubernetes.io/projected/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-kube-api-access-2mlwl\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.343732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mlwl\" (UniqueName: \"kubernetes.io/projected/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-kube-api-access-2mlwl\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 
13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.343806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.343831 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.349682 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.350140 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.363710 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mlwl\" (UniqueName: \"kubernetes.io/projected/d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4-kube-api-access-2mlwl\") pod \"nova-cell0-conductor-0\" (UID: \"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4\") " pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:52 crc kubenswrapper[4921]: I0318 13:46:52.495197 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:53 crc kubenswrapper[4921]: I0318 13:46:53.037848 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 13:46:53 crc kubenswrapper[4921]: I0318 13:46:53.065121 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4","Type":"ContainerStarted","Data":"a472ab5004a709d2ebffbb45c4b228be979b70416d9a274ac6efe74fcbdf4ce3"} Mar 18 13:46:53 crc kubenswrapper[4921]: I0318 13:46:53.220858 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396d2df9-ddb7-4514-a6e2-991b6c410448" path="/var/lib/kubelet/pods/396d2df9-ddb7-4514-a6e2-991b6c410448/volumes" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.077567 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4","Type":"ContainerStarted","Data":"7459131f852c03bd153fc5553f0b4345985b4d28c48fd615d0aea24f33e09514"} Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.077700 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.093280 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.093263265 podStartE2EDuration="2.093263265s" podCreationTimestamp="2026-03-18 13:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:54.091192036 +0000 UTC m=+5833.641112675" watchObservedRunningTime="2026-03-18 13:46:54.093263265 +0000 UTC m=+5833.643183904" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.554905 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.586785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-combined-ca-bundle\") pod \"89230afb-39c8-4ada-a134-d329c12c54d9\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.586835 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w4w7\" (UniqueName: \"kubernetes.io/projected/89230afb-39c8-4ada-a134-d329c12c54d9-kube-api-access-4w4w7\") pod \"89230afb-39c8-4ada-a134-d329c12c54d9\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.587062 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-config-data\") pod \"89230afb-39c8-4ada-a134-d329c12c54d9\" (UID: \"89230afb-39c8-4ada-a134-d329c12c54d9\") " Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.605799 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89230afb-39c8-4ada-a134-d329c12c54d9-kube-api-access-4w4w7" (OuterVolumeSpecName: "kube-api-access-4w4w7") pod "89230afb-39c8-4ada-a134-d329c12c54d9" (UID: "89230afb-39c8-4ada-a134-d329c12c54d9"). InnerVolumeSpecName "kube-api-access-4w4w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.635598 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-config-data" (OuterVolumeSpecName: "config-data") pod "89230afb-39c8-4ada-a134-d329c12c54d9" (UID: "89230afb-39c8-4ada-a134-d329c12c54d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.642303 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89230afb-39c8-4ada-a134-d329c12c54d9" (UID: "89230afb-39c8-4ada-a134-d329c12c54d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.689106 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.689467 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89230afb-39c8-4ada-a134-d329c12c54d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:54 crc kubenswrapper[4921]: I0318 13:46:54.689480 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w4w7\" (UniqueName: \"kubernetes.io/projected/89230afb-39c8-4ada-a134-d329c12c54d9-kube-api-access-4w4w7\") on node \"crc\" DevicePath \"\"" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.090628 4921 generic.go:334] "Generic (PLEG): container finished" podID="89230afb-39c8-4ada-a134-d329c12c54d9" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" exitCode=0 Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.090681 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.090722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"89230afb-39c8-4ada-a134-d329c12c54d9","Type":"ContainerDied","Data":"62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414"} Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.091223 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"89230afb-39c8-4ada-a134-d329c12c54d9","Type":"ContainerDied","Data":"82de5935d1836c81370803c9d4ad89c3cee7754031402a34539e8441bbd39058"} Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.091276 4921 scope.go:117] "RemoveContainer" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.122955 4921 scope.go:117] "RemoveContainer" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" Mar 18 13:46:55 crc kubenswrapper[4921]: E0318 13:46:55.127325 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414\": container with ID starting with 62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414 not found: ID does not exist" containerID="62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.127376 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414"} err="failed to get container status \"62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414\": rpc error: code = NotFound desc = could not find container \"62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414\": container with ID starting with 
62ce4bb9c6ca4929e6a01828c48c48f2cb68c55e56f86fe61fa074bb1bc7f414 not found: ID does not exist" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.146577 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.163924 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.185243 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:46:55 crc kubenswrapper[4921]: E0318 13:46:55.186869 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" containerName="nova-scheduler-scheduler" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.187029 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" containerName="nova-scheduler-scheduler" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.187704 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" containerName="nova-scheduler-scheduler" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.189229 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.196020 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.237622 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89230afb-39c8-4ada-a134-d329c12c54d9" path="/var/lib/kubelet/pods/89230afb-39c8-4ada-a134-d329c12c54d9/volumes" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.238336 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.329845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2h7r\" (UniqueName: \"kubernetes.io/projected/6f4b2844-289a-49cc-aba0-7b97e9105181-kube-api-access-t2h7r\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.330174 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4b2844-289a-49cc-aba0-7b97e9105181-config-data\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.330316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4b2844-289a-49cc-aba0-7b97e9105181-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.432155 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f4b2844-289a-49cc-aba0-7b97e9105181-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.432284 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2h7r\" (UniqueName: \"kubernetes.io/projected/6f4b2844-289a-49cc-aba0-7b97e9105181-kube-api-access-t2h7r\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.432404 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4b2844-289a-49cc-aba0-7b97e9105181-config-data\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.445423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4b2844-289a-49cc-aba0-7b97e9105181-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.450072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4b2844-289a-49cc-aba0-7b97e9105181-config-data\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.451358 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2h7r\" (UniqueName: \"kubernetes.io/projected/6f4b2844-289a-49cc-aba0-7b97e9105181-kube-api-access-t2h7r\") pod \"nova-scheduler-0\" (UID: \"6f4b2844-289a-49cc-aba0-7b97e9105181\") " 
pod="openstack/nova-scheduler-0" Mar 18 13:46:55 crc kubenswrapper[4921]: I0318 13:46:55.547444 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 13:46:56 crc kubenswrapper[4921]: I0318 13:46:56.033153 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 13:46:56 crc kubenswrapper[4921]: I0318 13:46:56.129610 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f4b2844-289a-49cc-aba0-7b97e9105181","Type":"ContainerStarted","Data":"8fd7557eefd28e207f54e2d1120d2d7a8f40f90ff81a580e0147bcef8007b1e7"} Mar 18 13:46:56 crc kubenswrapper[4921]: I0318 13:46:56.642550 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:56 crc kubenswrapper[4921]: I0318 13:46:56.680153 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:57 crc kubenswrapper[4921]: I0318 13:46:57.140725 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f4b2844-289a-49cc-aba0-7b97e9105181","Type":"ContainerStarted","Data":"f935b3afe2398b0ede238a46d0cd4a73682e27a4f2cf9357f153774f7bb77f99"} Mar 18 13:46:57 crc kubenswrapper[4921]: I0318 13:46:57.151766 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 13:46:57 crc kubenswrapper[4921]: I0318 13:46:57.190629 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.19059698 podStartE2EDuration="2.19059698s" podCreationTimestamp="2026-03-18 13:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:46:57.164697841 +0000 UTC m=+5836.714618490" 
watchObservedRunningTime="2026-03-18 13:46:57.19059698 +0000 UTC m=+5836.740517659" Mar 18 13:46:57 crc kubenswrapper[4921]: I0318 13:46:57.333593 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 13:46:59 crc kubenswrapper[4921]: I0318 13:46:59.208904 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:46:59 crc kubenswrapper[4921]: E0318 13:46:59.209452 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:46:59 crc kubenswrapper[4921]: I0318 13:46:59.501336 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:46:59 crc kubenswrapper[4921]: I0318 13:46:59.502740 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 13:46:59 crc kubenswrapper[4921]: I0318 13:46:59.554934 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:46:59 crc kubenswrapper[4921]: I0318 13:46:59.555518 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 13:47:00 crc kubenswrapper[4921]: I0318 13:47:00.542483 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec734ce5-0367-4e51-9875-3390097b2ebc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.114:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:47:00 crc 
kubenswrapper[4921]: I0318 13:47:00.547890 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 13:47:00 crc kubenswrapper[4921]: I0318 13:47:00.667291 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e1e6617b-aaa3-4d0f-81cb-1149d2be62a6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.115:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:47:00 crc kubenswrapper[4921]: I0318 13:47:00.667592 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec734ce5-0367-4e51-9875-3390097b2ebc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.114:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:47:00 crc kubenswrapper[4921]: I0318 13:47:00.667604 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e1e6617b-aaa3-4d0f-81cb-1149d2be62a6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.115:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 13:47:02 crc kubenswrapper[4921]: I0318 13:47:02.522627 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.727668 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.730206 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.737854 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.743270 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.799219 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.799340 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.799478 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.799527 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.799773 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqh6s\" (UniqueName: \"kubernetes.io/projected/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-kube-api-access-cqh6s\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.800053 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.901318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.901378 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.901433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqh6s\" (UniqueName: \"kubernetes.io/projected/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-kube-api-access-cqh6s\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.901509 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.901542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.901575 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.902058 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.908999 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.910280 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " 
pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.915260 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.918401 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqh6s\" (UniqueName: \"kubernetes.io/projected/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-kube-api-access-cqh6s\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:03 crc kubenswrapper[4921]: I0318 13:47:03.919767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:04 crc kubenswrapper[4921]: I0318 13:47:04.074458 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:47:04 crc kubenswrapper[4921]: I0318 13:47:04.541968 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:04 crc kubenswrapper[4921]: W0318 13:47:04.551221 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4f2be2a_c5b6_419a_828b_7381fc7a64f8.slice/crio-da8278b7ea92ffd500603660bc297b94016759c2138946f940708e30df4a5aea WatchSource:0}: Error finding container da8278b7ea92ffd500603660bc297b94016759c2138946f940708e30df4a5aea: Status 404 returned error can't find the container with id da8278b7ea92ffd500603660bc297b94016759c2138946f940708e30df4a5aea Mar 18 13:47:05 crc kubenswrapper[4921]: I0318 13:47:05.243366 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4f2be2a-c5b6-419a-828b-7381fc7a64f8","Type":"ContainerStarted","Data":"da8278b7ea92ffd500603660bc297b94016759c2138946f940708e30df4a5aea"} Mar 18 13:47:05 crc kubenswrapper[4921]: I0318 13:47:05.549371 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 13:47:05 crc kubenswrapper[4921]: I0318 13:47:05.633443 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 13:47:05 crc kubenswrapper[4921]: I0318 13:47:05.723599 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:47:05 crc kubenswrapper[4921]: I0318 13:47:05.723909 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api-log" containerID="cri-o://7af8adaf6f105cace9e678f60dd91ed48a480a68408b9799e3ce3aa5ff9a366c" gracePeriod=30 Mar 18 13:47:05 crc kubenswrapper[4921]: I0318 13:47:05.724141 4921 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/cinder-api-0" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api" containerID="cri-o://eced214ead4bcf784250d418e904c2456bbb41573533354b3ae6e720199dd2d0" gracePeriod=30 Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.255521 4921 generic.go:334] "Generic (PLEG): container finished" podID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerID="7af8adaf6f105cace9e678f60dd91ed48a480a68408b9799e3ce3aa5ff9a366c" exitCode=143 Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.255624 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18d6390e-8783-44aa-8a07-9b522fb9a4d2","Type":"ContainerDied","Data":"7af8adaf6f105cace9e678f60dd91ed48a480a68408b9799e3ce3aa5ff9a366c"} Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.258370 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4f2be2a-c5b6-419a-828b-7381fc7a64f8","Type":"ContainerStarted","Data":"a0dc5eb474293dd3496a2b98f0d038994cd9fd5037cc8a20e995cf5844cd0e9c"} Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.302068 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.372725 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.374674 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.377102 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.386141 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.468824 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.468918 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-dev\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.468961 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469021 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-run\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: 
I0318 13:47:06.469096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469285 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469349 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-sys\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469437 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469666 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469882 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.469995 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.470042 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.470080 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg8v2\" (UniqueName: \"kubernetes.io/projected/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-kube-api-access-jg8v2\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572165 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572273 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572310 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-sys\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572348 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572402 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572426 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-sys\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572497 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: 
\"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572652 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572693 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572743 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg8v2\" (UniqueName: \"kubernetes.io/projected/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-kube-api-access-jg8v2\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572837 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.572934 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc 
kubenswrapper[4921]: I0318 13:47:06.573009 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-dev\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573126 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573151 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573257 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-dev\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-run\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573318 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573303 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573281 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-run\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " 
pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.573607 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.577975 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.578290 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.578343 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.578820 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.582578 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceph\" (UniqueName: \"kubernetes.io/projected/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.603665 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg8v2\" (UniqueName: \"kubernetes.io/projected/794e18b3-f1c7-4e09-a80c-7a4b46bd1636-kube-api-access-jg8v2\") pod \"cinder-volume-volume1-0\" (UID: \"794e18b3-f1c7-4e09-a80c-7a4b46bd1636\") " pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:06 crc kubenswrapper[4921]: I0318 13:47:06.699943 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.127030 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.133164 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.136243 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.160238 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.188670 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-run\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.188748 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.188816 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-ceph\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.188924 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.188988 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189050 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-sys\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189178 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-scripts\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189238 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-config-data-custom\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189405 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189497 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-lib-modules\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189552 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-nvme\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189619 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7f5\" (UniqueName: \"kubernetes.io/projected/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-kube-api-access-ht7f5\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.189811 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-dev\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.190057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-machine-id\") pod \"cinder-backup-0\" (UID: 
\"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.190096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-config-data\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.274512 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4f2be2a-c5b6-419a-828b-7381fc7a64f8","Type":"ContainerStarted","Data":"2cfa698474a92e27beb974eea8f2b37ff7e60764befb1ed9d863746f49e13a27"} Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.296244 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.29621827 podStartE2EDuration="4.29621827s" podCreationTimestamp="2026-03-18 13:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:47:07.292198535 +0000 UTC m=+5846.842119184" watchObservedRunningTime="2026-03-18 13:47:07.29621827 +0000 UTC m=+5846.846138909" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297272 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-sys\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297358 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-scripts\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" 
Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297398 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-config-data-custom\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297419 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-sys\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297429 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297530 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297556 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-lib-modules\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297533 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-lib-modules\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297599 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-nvme\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297631 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7f5\" (UniqueName: \"kubernetes.io/projected/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-kube-api-access-ht7f5\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297674 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-dev\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297708 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-config-data\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297727 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc 
kubenswrapper[4921]: I0318 13:47:07.297778 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-run\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297803 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297828 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-ceph\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297867 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297903 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.297927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.298014 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.298285 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.298380 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-nvme\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.298787 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-dev\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.301003 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-run\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.301014 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.301232 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.305962 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-config-data-custom\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.306184 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.307947 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-scripts\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.314012 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-config-data\") pod \"cinder-backup-0\" (UID: 
\"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.321737 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-ceph\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.325708 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7f5\" (UniqueName: \"kubernetes.io/projected/47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23-kube-api-access-ht7f5\") pod \"cinder-backup-0\" (UID: \"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23\") " pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.343840 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.467935 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.501288 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.501510 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.555472 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:47:07 crc kubenswrapper[4921]: I0318 13:47:07.555755 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:08.282445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"794e18b3-f1c7-4e09-a80c-7a4b46bd1636","Type":"ContainerStarted","Data":"3baf436d9bfc143c6631e7509093ba35dba1cd00aed84dcfd28d284796fc9086"} Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:08.871852 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.111:8776/healthcheck\": read tcp 10.217.0.2:36424->10.217.1.111:8776: read: connection reset by peer" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.076088 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.332935 4921 generic.go:334] "Generic (PLEG): container finished" podID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerID="eced214ead4bcf784250d418e904c2456bbb41573533354b3ae6e720199dd2d0" exitCode=0 Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.333388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"18d6390e-8783-44aa-8a07-9b522fb9a4d2","Type":"ContainerDied","Data":"eced214ead4bcf784250d418e904c2456bbb41573533354b3ae6e720199dd2d0"} Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.336521 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"794e18b3-f1c7-4e09-a80c-7a4b46bd1636","Type":"ContainerStarted","Data":"4a22e8667fe3cfbfc969d9619fd5e2bc39e0384568a3503731ff3321ae46b8d9"} Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.451469 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.510951 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.511528 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.519343 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.521910 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545219 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh5lh\" (UniqueName: \"kubernetes.io/projected/18d6390e-8783-44aa-8a07-9b522fb9a4d2-kube-api-access-jh5lh\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545263 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-combined-ca-bundle\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: 
\"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18d6390e-8783-44aa-8a07-9b522fb9a4d2-etc-machine-id\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545347 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data-custom\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545374 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18d6390e-8783-44aa-8a07-9b522fb9a4d2-logs\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.545437 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-scripts\") pod \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\" (UID: \"18d6390e-8783-44aa-8a07-9b522fb9a4d2\") " Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.549309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18d6390e-8783-44aa-8a07-9b522fb9a4d2-etc-machine-id" 
(OuterVolumeSpecName: "etc-machine-id") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.549806 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d6390e-8783-44aa-8a07-9b522fb9a4d2-logs" (OuterVolumeSpecName: "logs") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.553923 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-scripts" (OuterVolumeSpecName: "scripts") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.556504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d6390e-8783-44aa-8a07-9b522fb9a4d2-kube-api-access-jh5lh" (OuterVolumeSpecName: "kube-api-access-jh5lh") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). InnerVolumeSpecName "kube-api-access-jh5lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.571063 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.571697 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.572298 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.598837 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.628753 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.646984 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.647029 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh5lh\" (UniqueName: \"kubernetes.io/projected/18d6390e-8783-44aa-8a07-9b522fb9a4d2-kube-api-access-jh5lh\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.647042 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.647053 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18d6390e-8783-44aa-8a07-9b522fb9a4d2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.647064 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.647075 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18d6390e-8783-44aa-8a07-9b522fb9a4d2-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.654274 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data" (OuterVolumeSpecName: "config-data") pod "18d6390e-8783-44aa-8a07-9b522fb9a4d2" (UID: "18d6390e-8783-44aa-8a07-9b522fb9a4d2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.742592 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 18 13:47:09 crc kubenswrapper[4921]: W0318 13:47:09.743754 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dbc0b8_741d_4d4e_ab32_0bf3ab3cca23.slice/crio-11ec3ff37aca698e60313270f8fa1d0084ecb1426ee0e0953961bf7a042740bf WatchSource:0}: Error finding container 11ec3ff37aca698e60313270f8fa1d0084ecb1426ee0e0953961bf7a042740bf: Status 404 returned error can't find the container with id 11ec3ff37aca698e60313270f8fa1d0084ecb1426ee0e0953961bf7a042740bf Mar 18 13:47:09 crc kubenswrapper[4921]: I0318 13:47:09.748837 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d6390e-8783-44aa-8a07-9b522fb9a4d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.347763 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"18d6390e-8783-44aa-8a07-9b522fb9a4d2","Type":"ContainerDied","Data":"f65ea92f0ad6fc0b968c5c13f3481ea5ed88179795e6b11c618ade68b40abf5b"} Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.348154 4921 scope.go:117] "RemoveContainer" containerID="eced214ead4bcf784250d418e904c2456bbb41573533354b3ae6e720199dd2d0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.347799 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.351563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"794e18b3-f1c7-4e09-a80c-7a4b46bd1636","Type":"ContainerStarted","Data":"7f6e508cfddb5ed206fc2ea8b7d286a9cbc9ef485ff314922b771e092c7e2646"} Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.353612 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23","Type":"ContainerStarted","Data":"11ec3ff37aca698e60313270f8fa1d0084ecb1426ee0e0953961bf7a042740bf"} Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.359432 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.394497 4921 scope.go:117] "RemoveContainer" containerID="7af8adaf6f105cace9e678f60dd91ed48a480a68408b9799e3ce3aa5ff9a366c" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.459052 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.899676976 podStartE2EDuration="4.459034673s" podCreationTimestamp="2026-03-18 13:47:06 +0000 UTC" firstStartedPulling="2026-03-18 13:47:07.354829443 +0000 UTC m=+5846.904750082" lastFinishedPulling="2026-03-18 13:47:08.91418714 +0000 UTC m=+5848.464107779" observedRunningTime="2026-03-18 13:47:10.394761918 +0000 UTC m=+5849.944682567" watchObservedRunningTime="2026-03-18 13:47:10.459034673 +0000 UTC m=+5850.008955312" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.525444 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.548379 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.589734 4921 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:47:10 crc kubenswrapper[4921]: E0318 13:47:10.590167 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.590187 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api" Mar 18 13:47:10 crc kubenswrapper[4921]: E0318 13:47:10.590257 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api-log" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.590265 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api-log" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.590452 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api-log" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.590482 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" containerName="cinder-api" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.591669 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.596418 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.598376 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0de51569-4dc0-4e65-a286-461d82659895-logs\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777642 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssmqq\" (UniqueName: \"kubernetes.io/projected/0de51569-4dc0-4e65-a286-461d82659895-kube-api-access-ssmqq\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777694 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-config-data\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777730 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-scripts\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777757 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-config-data-custom\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777809 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0de51569-4dc0-4e65-a286-461d82659895-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.777844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.878936 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-config-data\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.878998 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-scripts\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879029 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879082 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0de51569-4dc0-4e65-a286-461d82659895-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879159 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879211 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0de51569-4dc0-4e65-a286-461d82659895-logs\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879241 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssmqq\" (UniqueName: \"kubernetes.io/projected/0de51569-4dc0-4e65-a286-461d82659895-kube-api-access-ssmqq\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879501 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0de51569-4dc0-4e65-a286-461d82659895-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.879947 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/0de51569-4dc0-4e65-a286-461d82659895-logs\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.883534 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-scripts\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.883800 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.884294 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-config-data\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.885908 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0de51569-4dc0-4e65-a286-461d82659895-config-data-custom\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.896792 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssmqq\" (UniqueName: \"kubernetes.io/projected/0de51569-4dc0-4e65-a286-461d82659895-kube-api-access-ssmqq\") pod \"cinder-api-0\" (UID: \"0de51569-4dc0-4e65-a286-461d82659895\") " pod="openstack/cinder-api-0" Mar 18 13:47:10 crc kubenswrapper[4921]: I0318 13:47:10.923604 4921 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 13:47:11 crc kubenswrapper[4921]: I0318 13:47:11.245151 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d6390e-8783-44aa-8a07-9b522fb9a4d2" path="/var/lib/kubelet/pods/18d6390e-8783-44aa-8a07-9b522fb9a4d2/volumes" Mar 18 13:47:11 crc kubenswrapper[4921]: I0318 13:47:11.411317 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23","Type":"ContainerStarted","Data":"dd735ff5266377e1866a65ecf9699ef32d0c54d0e6a105cf91f56c3eeb1f0e04"} Mar 18 13:47:11 crc kubenswrapper[4921]: I0318 13:47:11.411625 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23","Type":"ContainerStarted","Data":"f91a553985654b4a29e5a3b2e9dab048b2fa433d8847f44a2c45f96deea27d46"} Mar 18 13:47:11 crc kubenswrapper[4921]: I0318 13:47:11.481123 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.508980189 podStartE2EDuration="4.481093752s" podCreationTimestamp="2026-03-18 13:47:07 +0000 UTC" firstStartedPulling="2026-03-18 13:47:09.747726016 +0000 UTC m=+5849.297646655" lastFinishedPulling="2026-03-18 13:47:10.719839579 +0000 UTC m=+5850.269760218" observedRunningTime="2026-03-18 13:47:11.479322601 +0000 UTC m=+5851.029243260" watchObservedRunningTime="2026-03-18 13:47:11.481093752 +0000 UTC m=+5851.031014391" Mar 18 13:47:11 crc kubenswrapper[4921]: I0318 13:47:11.520453 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 13:47:11 crc kubenswrapper[4921]: W0318 13:47:11.521696 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0de51569_4dc0_4e65_a286_461d82659895.slice/crio-358980a2392260f81c97ca94f71493f4286219dadcd8667fec2945e2a8522957 WatchSource:0}: Error finding container 358980a2392260f81c97ca94f71493f4286219dadcd8667fec2945e2a8522957: Status 404 returned error can't find the container with id 358980a2392260f81c97ca94f71493f4286219dadcd8667fec2945e2a8522957 Mar 18 13:47:11 crc kubenswrapper[4921]: I0318 13:47:11.700141 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:12 crc kubenswrapper[4921]: I0318 13:47:12.209618 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:47:12 crc kubenswrapper[4921]: E0318 13:47:12.210147 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:47:12 crc kubenswrapper[4921]: I0318 13:47:12.446958 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0de51569-4dc0-4e65-a286-461d82659895","Type":"ContainerStarted","Data":"87cab97d6e648d6ea71e22bfabca5d5183831ea1ee95cebab6cb15f791cafa7c"} Mar 18 13:47:12 crc kubenswrapper[4921]: I0318 13:47:12.447019 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0de51569-4dc0-4e65-a286-461d82659895","Type":"ContainerStarted","Data":"358980a2392260f81c97ca94f71493f4286219dadcd8667fec2945e2a8522957"} Mar 18 13:47:12 crc kubenswrapper[4921]: I0318 13:47:12.468991 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/cinder-backup-0" Mar 18 13:47:13 crc kubenswrapper[4921]: I0318 13:47:13.457984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0de51569-4dc0-4e65-a286-461d82659895","Type":"ContainerStarted","Data":"9a679acff10a17e1aef3a2e64a96d91ca6095b55b462fb71efa6904f456ed542"} Mar 18 13:47:13 crc kubenswrapper[4921]: I0318 13:47:13.458565 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 13:47:13 crc kubenswrapper[4921]: I0318 13:47:13.478136 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.478087932 podStartE2EDuration="3.478087932s" podCreationTimestamp="2026-03-18 13:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:47:13.474653053 +0000 UTC m=+5853.024573692" watchObservedRunningTime="2026-03-18 13:47:13.478087932 +0000 UTC m=+5853.028008571" Mar 18 13:47:14 crc kubenswrapper[4921]: I0318 13:47:14.264451 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 13:47:14 crc kubenswrapper[4921]: I0318 13:47:14.344184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:14 crc kubenswrapper[4921]: I0318 13:47:14.465361 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="cinder-scheduler" containerID="cri-o://a0dc5eb474293dd3496a2b98f0d038994cd9fd5037cc8a20e995cf5844cd0e9c" gracePeriod=30 Mar 18 13:47:14 crc kubenswrapper[4921]: I0318 13:47:14.465414 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="probe" 
containerID="cri-o://2cfa698474a92e27beb974eea8f2b37ff7e60764befb1ed9d863746f49e13a27" gracePeriod=30 Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.538247 4921 generic.go:334] "Generic (PLEG): container finished" podID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerID="2cfa698474a92e27beb974eea8f2b37ff7e60764befb1ed9d863746f49e13a27" exitCode=0 Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.538547 4921 generic.go:334] "Generic (PLEG): container finished" podID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerID="a0dc5eb474293dd3496a2b98f0d038994cd9fd5037cc8a20e995cf5844cd0e9c" exitCode=0 Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.538572 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4f2be2a-c5b6-419a-828b-7381fc7a64f8","Type":"ContainerDied","Data":"2cfa698474a92e27beb974eea8f2b37ff7e60764befb1ed9d863746f49e13a27"} Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.538606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4f2be2a-c5b6-419a-828b-7381fc7a64f8","Type":"ContainerDied","Data":"a0dc5eb474293dd3496a2b98f0d038994cd9fd5037cc8a20e995cf5844cd0e9c"} Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.772619 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.881299 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqh6s\" (UniqueName: \"kubernetes.io/projected/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-kube-api-access-cqh6s\") pod \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.881355 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-combined-ca-bundle\") pod \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.881423 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-etc-machine-id\") pod \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.881509 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data\") pod \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.881534 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-scripts\") pod \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.881664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data-custom\") pod \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\" (UID: \"b4f2be2a-c5b6-419a-828b-7381fc7a64f8\") " Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.882529 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b4f2be2a-c5b6-419a-828b-7381fc7a64f8" (UID: "b4f2be2a-c5b6-419a-828b-7381fc7a64f8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.888614 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b4f2be2a-c5b6-419a-828b-7381fc7a64f8" (UID: "b4f2be2a-c5b6-419a-828b-7381fc7a64f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.889280 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-kube-api-access-cqh6s" (OuterVolumeSpecName: "kube-api-access-cqh6s") pod "b4f2be2a-c5b6-419a-828b-7381fc7a64f8" (UID: "b4f2be2a-c5b6-419a-828b-7381fc7a64f8"). InnerVolumeSpecName "kube-api-access-cqh6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.897256 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-scripts" (OuterVolumeSpecName: "scripts") pod "b4f2be2a-c5b6-419a-828b-7381fc7a64f8" (UID: "b4f2be2a-c5b6-419a-828b-7381fc7a64f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.959570 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4f2be2a-c5b6-419a-828b-7381fc7a64f8" (UID: "b4f2be2a-c5b6-419a-828b-7381fc7a64f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.977458 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data" (OuterVolumeSpecName: "config-data") pod "b4f2be2a-c5b6-419a-828b-7381fc7a64f8" (UID: "b4f2be2a-c5b6-419a-828b-7381fc7a64f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.985454 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.985496 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqh6s\" (UniqueName: \"kubernetes.io/projected/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-kube-api-access-cqh6s\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.985508 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.985518 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-etc-machine-id\") on node 
\"crc\" DevicePath \"\"" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.985528 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:15 crc kubenswrapper[4921]: I0318 13:47:15.985540 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f2be2a-c5b6-419a-828b-7381fc7a64f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.552435 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b4f2be2a-c5b6-419a-828b-7381fc7a64f8","Type":"ContainerDied","Data":"da8278b7ea92ffd500603660bc297b94016759c2138946f940708e30df4a5aea"} Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.552490 4921 scope.go:117] "RemoveContainer" containerID="2cfa698474a92e27beb974eea8f2b37ff7e60764befb1ed9d863746f49e13a27" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.552515 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.589658 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.598220 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.599015 4921 scope.go:117] "RemoveContainer" containerID="a0dc5eb474293dd3496a2b98f0d038994cd9fd5037cc8a20e995cf5844cd0e9c" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.623081 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:16 crc kubenswrapper[4921]: E0318 13:47:16.623583 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="probe" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.623602 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="probe" Mar 18 13:47:16 crc kubenswrapper[4921]: E0318 13:47:16.623620 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="cinder-scheduler" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.623626 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="cinder-scheduler" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.623809 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="probe" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.623834 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" containerName="cinder-scheduler" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.624786 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.626730 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.632989 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.801376 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.801682 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-scripts\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.801797 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.801893 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-config-data\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.801963 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229d2\" (UniqueName: \"kubernetes.io/projected/c240b2e2-9404-4c37-a827-5183cc419888-kube-api-access-229d2\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.802082 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c240b2e2-9404-4c37-a827-5183cc419888-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904068 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904191 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-scripts\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904221 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904262 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-config-data\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229d2\" (UniqueName: \"kubernetes.io/projected/c240b2e2-9404-4c37-a827-5183cc419888-kube-api-access-229d2\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904373 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c240b2e2-9404-4c37-a827-5183cc419888-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.904473 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c240b2e2-9404-4c37-a827-5183cc419888-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.908703 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-scripts\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.909408 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-config-data\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 
18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.909857 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.913653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c240b2e2-9404-4c37-a827-5183cc419888-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.947661 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229d2\" (UniqueName: \"kubernetes.io/projected/c240b2e2-9404-4c37-a827-5183cc419888-kube-api-access-229d2\") pod \"cinder-scheduler-0\" (UID: \"c240b2e2-9404-4c37-a827-5183cc419888\") " pod="openstack/cinder-scheduler-0" Mar 18 13:47:16 crc kubenswrapper[4921]: I0318 13:47:16.993080 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 18 13:47:17 crc kubenswrapper[4921]: I0318 13:47:17.226436 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f2be2a-c5b6-419a-828b-7381fc7a64f8" path="/var/lib/kubelet/pods/b4f2be2a-c5b6-419a-828b-7381fc7a64f8/volumes" Mar 18 13:47:17 crc kubenswrapper[4921]: I0318 13:47:17.247342 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 13:47:17 crc kubenswrapper[4921]: I0318 13:47:17.683081 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 18 13:47:17 crc kubenswrapper[4921]: W0318 13:47:17.731690 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc240b2e2_9404_4c37_a827_5183cc419888.slice/crio-367d69998e4b6a0c3d0333adabb68b580bad277c025011444b56c59422624f49 WatchSource:0}: Error finding container 367d69998e4b6a0c3d0333adabb68b580bad277c025011444b56c59422624f49: Status 404 returned error can't find the container with id 367d69998e4b6a0c3d0333adabb68b580bad277c025011444b56c59422624f49 Mar 18 13:47:17 crc kubenswrapper[4921]: I0318 13:47:17.754208 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 13:47:18 crc kubenswrapper[4921]: I0318 13:47:18.598646 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c240b2e2-9404-4c37-a827-5183cc419888","Type":"ContainerStarted","Data":"f8724c79f6159f8db2607e6db68f98c10548a911d22168a002ce82b3f4a2f99e"} Mar 18 13:47:18 crc kubenswrapper[4921]: I0318 13:47:18.598957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c240b2e2-9404-4c37-a827-5183cc419888","Type":"ContainerStarted","Data":"367d69998e4b6a0c3d0333adabb68b580bad277c025011444b56c59422624f49"} Mar 18 13:47:19 crc kubenswrapper[4921]: I0318 13:47:19.611780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c240b2e2-9404-4c37-a827-5183cc419888","Type":"ContainerStarted","Data":"b8277c5702e0757d3708eb7db5d75481827622a692ce2d76316f785ba21a7d3f"} Mar 18 13:47:19 crc kubenswrapper[4921]: I0318 13:47:19.637622 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.6375993859999998 podStartE2EDuration="3.637599386s" podCreationTimestamp="2026-03-18 13:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:47:19.626836769 +0000 UTC m=+5859.176757398" watchObservedRunningTime="2026-03-18 13:47:19.637599386 +0000 UTC m=+5859.187520025" Mar 18 13:47:22 crc kubenswrapper[4921]: I0318 13:47:22.248612 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 13:47:22 crc kubenswrapper[4921]: I0318 13:47:22.924765 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 13:47:25 crc kubenswrapper[4921]: I0318 13:47:25.211694 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:47:25 crc kubenswrapper[4921]: E0318 13:47:25.212771 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:47:27 crc kubenswrapper[4921]: I0318 13:47:27.466473 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 13:47:38 crc kubenswrapper[4921]: I0318 13:47:38.210229 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:47:38 crc kubenswrapper[4921]: E0318 13:47:38.210989 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:47:50 crc kubenswrapper[4921]: I0318 13:47:50.209762 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:47:50 crc kubenswrapper[4921]: E0318 13:47:50.211103 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.157521 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564028-ddjgn"] Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.161344 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.165371 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.165527 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.165546 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.172085 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-ddjgn"] Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.290323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v6gj\" (UniqueName: \"kubernetes.io/projected/97dab222-fe06-4654-bf9a-49f3790bbebf-kube-api-access-4v6gj\") pod \"auto-csr-approver-29564028-ddjgn\" (UID: \"97dab222-fe06-4654-bf9a-49f3790bbebf\") " pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.392478 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v6gj\" (UniqueName: \"kubernetes.io/projected/97dab222-fe06-4654-bf9a-49f3790bbebf-kube-api-access-4v6gj\") pod \"auto-csr-approver-29564028-ddjgn\" (UID: \"97dab222-fe06-4654-bf9a-49f3790bbebf\") " pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.416886 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v6gj\" (UniqueName: \"kubernetes.io/projected/97dab222-fe06-4654-bf9a-49f3790bbebf-kube-api-access-4v6gj\") pod \"auto-csr-approver-29564028-ddjgn\" (UID: \"97dab222-fe06-4654-bf9a-49f3790bbebf\") " 
pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.481094 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:00 crc kubenswrapper[4921]: I0318 13:48:00.985629 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-ddjgn"] Mar 18 13:48:01 crc kubenswrapper[4921]: I0318 13:48:01.978861 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" event={"ID":"97dab222-fe06-4654-bf9a-49f3790bbebf","Type":"ContainerStarted","Data":"55d692c51e0ba2cc468f1e360ebe908048f59e6d06506ed7a481b76bd3b52095"} Mar 18 13:48:04 crc kubenswrapper[4921]: I0318 13:48:04.003530 4921 generic.go:334] "Generic (PLEG): container finished" podID="97dab222-fe06-4654-bf9a-49f3790bbebf" containerID="fba308647055a187ea1e18c06688340a871327483b7f6cf316017ba58e32e4eb" exitCode=0 Mar 18 13:48:04 crc kubenswrapper[4921]: I0318 13:48:04.003695 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" event={"ID":"97dab222-fe06-4654-bf9a-49f3790bbebf","Type":"ContainerDied","Data":"fba308647055a187ea1e18c06688340a871327483b7f6cf316017ba58e32e4eb"} Mar 18 13:48:04 crc kubenswrapper[4921]: I0318 13:48:04.209208 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:48:04 crc kubenswrapper[4921]: E0318 13:48:04.209487 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" 
Mar 18 13:48:05 crc kubenswrapper[4921]: I0318 13:48:05.388792 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:05 crc kubenswrapper[4921]: I0318 13:48:05.512764 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v6gj\" (UniqueName: \"kubernetes.io/projected/97dab222-fe06-4654-bf9a-49f3790bbebf-kube-api-access-4v6gj\") pod \"97dab222-fe06-4654-bf9a-49f3790bbebf\" (UID: \"97dab222-fe06-4654-bf9a-49f3790bbebf\") " Mar 18 13:48:05 crc kubenswrapper[4921]: I0318 13:48:05.556351 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97dab222-fe06-4654-bf9a-49f3790bbebf-kube-api-access-4v6gj" (OuterVolumeSpecName: "kube-api-access-4v6gj") pod "97dab222-fe06-4654-bf9a-49f3790bbebf" (UID: "97dab222-fe06-4654-bf9a-49f3790bbebf"). InnerVolumeSpecName "kube-api-access-4v6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:48:05 crc kubenswrapper[4921]: I0318 13:48:05.617330 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v6gj\" (UniqueName: \"kubernetes.io/projected/97dab222-fe06-4654-bf9a-49f3790bbebf-kube-api-access-4v6gj\") on node \"crc\" DevicePath \"\"" Mar 18 13:48:06 crc kubenswrapper[4921]: I0318 13:48:06.024981 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" event={"ID":"97dab222-fe06-4654-bf9a-49f3790bbebf","Type":"ContainerDied","Data":"55d692c51e0ba2cc468f1e360ebe908048f59e6d06506ed7a481b76bd3b52095"} Mar 18 13:48:06 crc kubenswrapper[4921]: I0318 13:48:06.025371 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55d692c51e0ba2cc468f1e360ebe908048f59e6d06506ed7a481b76bd3b52095" Mar 18 13:48:06 crc kubenswrapper[4921]: I0318 13:48:06.025450 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-ddjgn" Mar 18 13:48:06 crc kubenswrapper[4921]: I0318 13:48:06.488212 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-mq4z7"] Mar 18 13:48:06 crc kubenswrapper[4921]: I0318 13:48:06.491974 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-mq4z7"] Mar 18 13:48:07 crc kubenswrapper[4921]: I0318 13:48:07.220569 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eac136f-4ca2-44af-96a1-b20df93c0e01" path="/var/lib/kubelet/pods/5eac136f-4ca2-44af-96a1-b20df93c0e01/volumes" Mar 18 13:48:09 crc kubenswrapper[4921]: I0318 13:48:09.277634 4921 scope.go:117] "RemoveContainer" containerID="508153d08c9bb79926a7c188df478eb25849fcbb4c0a01573ed7f8d01bb9fb3d" Mar 18 13:48:09 crc kubenswrapper[4921]: I0318 13:48:09.302438 4921 scope.go:117] "RemoveContainer" containerID="0ab39b001b0e58bd1a2c39824bc72c555b0fa8a0bacb3c420aad6bbfbcbc64f7" Mar 18 13:48:09 crc kubenswrapper[4921]: I0318 13:48:09.353997 4921 scope.go:117] "RemoveContainer" containerID="6dbc05c37c241847136410f87a21b5c86e26194e2516942ad1998f466e1dfd5c" Mar 18 13:48:15 crc kubenswrapper[4921]: I0318 13:48:15.209402 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:48:15 crc kubenswrapper[4921]: E0318 13:48:15.210302 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:48:28 crc kubenswrapper[4921]: I0318 13:48:28.209643 4921 scope.go:117] "RemoveContainer" 
containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:48:28 crc kubenswrapper[4921]: E0318 13:48:28.210628 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:48:39 crc kubenswrapper[4921]: I0318 13:48:39.210171 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:48:39 crc kubenswrapper[4921]: E0318 13:48:39.211087 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:48:52 crc kubenswrapper[4921]: I0318 13:48:52.208954 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:48:52 crc kubenswrapper[4921]: E0318 13:48:52.210944 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:49:00 crc kubenswrapper[4921]: I0318 13:49:00.037994 4921 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bclj2"] Mar 18 13:49:00 crc kubenswrapper[4921]: I0318 13:49:00.050487 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bclj2"] Mar 18 13:49:01 crc kubenswrapper[4921]: I0318 13:49:01.219059 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7b3690-bd67-4875-bacb-5bfb5ff7c161" path="/var/lib/kubelet/pods/fc7b3690-bd67-4875-bacb-5bfb5ff7c161/volumes" Mar 18 13:49:02 crc kubenswrapper[4921]: I0318 13:49:02.031619 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2ee0-account-create-update-fg6v2"] Mar 18 13:49:02 crc kubenswrapper[4921]: I0318 13:49:02.042911 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2ee0-account-create-update-fg6v2"] Mar 18 13:49:03 crc kubenswrapper[4921]: I0318 13:49:03.220281 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ecdf7f2-9bc5-494d-b67e-0bccbac89401" path="/var/lib/kubelet/pods/7ecdf7f2-9bc5-494d-b67e-0bccbac89401/volumes" Mar 18 13:49:04 crc kubenswrapper[4921]: I0318 13:49:04.208909 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:49:04 crc kubenswrapper[4921]: E0318 13:49:04.209476 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.909871 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-p4s82"] Mar 18 13:49:05 crc kubenswrapper[4921]: E0318 13:49:05.910506 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97dab222-fe06-4654-bf9a-49f3790bbebf" containerName="oc" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.910532 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="97dab222-fe06-4654-bf9a-49f3790bbebf" containerName="oc" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.910796 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="97dab222-fe06-4654-bf9a-49f3790bbebf" containerName="oc" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.911716 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4s82" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.913937 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rm2jk" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.913988 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.925490 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-757l2"] Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.928085 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.941608 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4s82"] Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988365 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzwc8\" (UniqueName: \"kubernetes.io/projected/f54f1467-48d2-424f-b694-485a89daea5d-kube-api-access-xzwc8\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988534 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-run\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988658 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-log\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988691 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-run-ovn\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988715 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-etc-ovs\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54f1467-48d2-424f-b694-485a89daea5d-scripts\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988863 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvmm\" (UniqueName: \"kubernetes.io/projected/3ed65908-7baa-4263-adf8-14055c9fe856-kube-api-access-dcvmm\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988912 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-lib\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988956 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-log-ovn\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.988980 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-run\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:05 crc kubenswrapper[4921]: I0318 13:49:05.989237 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed65908-7baa-4263-adf8-14055c9fe856-scripts\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.000981 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-757l2"] Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091334 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-run\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091611 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-log\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091724 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-run-ovn\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091766 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-etc-ovs\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091809 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54f1467-48d2-424f-b694-485a89daea5d-scripts\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091876 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-run\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091888 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvmm\" (UniqueName: \"kubernetes.io/projected/3ed65908-7baa-4263-adf8-14055c9fe856-kube-api-access-dcvmm\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091972 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-etc-ovs\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091992 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-lib\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " 
pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.091887 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-log\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092031 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-run-ovn\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092052 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-lib\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092048 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-log-ovn\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092080 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-run\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092100 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed65908-7baa-4263-adf8-14055c9fe856-var-log-ovn\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54f1467-48d2-424f-b694-485a89daea5d-var-run\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092165 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed65908-7baa-4263-adf8-14055c9fe856-scripts\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.092230 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzwc8\" (UniqueName: \"kubernetes.io/projected/f54f1467-48d2-424f-b694-485a89daea5d-kube-api-access-xzwc8\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.094812 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54f1467-48d2-424f-b694-485a89daea5d-scripts\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.095173 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed65908-7baa-4263-adf8-14055c9fe856-scripts\") pod \"ovn-controller-p4s82\" (UID: 
\"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.117012 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzwc8\" (UniqueName: \"kubernetes.io/projected/f54f1467-48d2-424f-b694-485a89daea5d-kube-api-access-xzwc8\") pod \"ovn-controller-ovs-757l2\" (UID: \"f54f1467-48d2-424f-b694-485a89daea5d\") " pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.117865 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvmm\" (UniqueName: \"kubernetes.io/projected/3ed65908-7baa-4263-adf8-14055c9fe856-kube-api-access-dcvmm\") pod \"ovn-controller-p4s82\" (UID: \"3ed65908-7baa-4263-adf8-14055c9fe856\") " pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.311633 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4s82" Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.320526 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:06 crc kubenswrapper[4921]: W0318 13:49:06.896260 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed65908_7baa_4263_adf8_14055c9fe856.slice/crio-1beedd36f28e082c79fc1c6802c5894d3c1393db01162bd14d619788d9b96497 WatchSource:0}: Error finding container 1beedd36f28e082c79fc1c6802c5894d3c1393db01162bd14d619788d9b96497: Status 404 returned error can't find the container with id 1beedd36f28e082c79fc1c6802c5894d3c1393db01162bd14d619788d9b96497 Mar 18 13:49:06 crc kubenswrapper[4921]: I0318 13:49:06.899629 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4s82"] Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.296679 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-757l2"] Mar 18 13:49:07 crc kubenswrapper[4921]: W0318 13:49:07.304749 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf54f1467_48d2_424f_b694_485a89daea5d.slice/crio-093eca787982866b37638a9a491d67ec80613d279707fd86301b9e1f217d0a60 WatchSource:0}: Error finding container 093eca787982866b37638a9a491d67ec80613d279707fd86301b9e1f217d0a60: Status 404 returned error can't find the container with id 093eca787982866b37638a9a491d67ec80613d279707fd86301b9e1f217d0a60 Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.569855 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4s82" event={"ID":"3ed65908-7baa-4263-adf8-14055c9fe856","Type":"ContainerStarted","Data":"ab4dd609a3f1b91e0bfe6d2b7ae6d86ffa36eb987b7542741f9603ee947815cf"} Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.569931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4s82" 
event={"ID":"3ed65908-7baa-4263-adf8-14055c9fe856","Type":"ContainerStarted","Data":"1beedd36f28e082c79fc1c6802c5894d3c1393db01162bd14d619788d9b96497"} Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.569961 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-p4s82" Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.575948 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-757l2" event={"ID":"f54f1467-48d2-424f-b694-485a89daea5d","Type":"ContainerStarted","Data":"daf81d19db581b258cf9f9df168c9649fe54d31e6fe6f08e88a4817794085200"} Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.575983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-757l2" event={"ID":"f54f1467-48d2-424f-b694-485a89daea5d","Type":"ContainerStarted","Data":"093eca787982866b37638a9a491d67ec80613d279707fd86301b9e1f217d0a60"} Mar 18 13:49:07 crc kubenswrapper[4921]: I0318 13:49:07.597363 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-p4s82" podStartSLOduration=2.597331661 podStartE2EDuration="2.597331661s" podCreationTimestamp="2026-03-18 13:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:49:07.591067552 +0000 UTC m=+5967.140988181" watchObservedRunningTime="2026-03-18 13:49:07.597331661 +0000 UTC m=+5967.147252310" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.554326 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8mwqw"] Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.560974 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.563300 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.570299 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8mwqw"] Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.588186 4921 generic.go:334] "Generic (PLEG): container finished" podID="f54f1467-48d2-424f-b694-485a89daea5d" containerID="daf81d19db581b258cf9f9df168c9649fe54d31e6fe6f08e88a4817794085200" exitCode=0 Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.588287 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-757l2" event={"ID":"f54f1467-48d2-424f-b694-485a89daea5d","Type":"ContainerDied","Data":"daf81d19db581b258cf9f9df168c9649fe54d31e6fe6f08e88a4817794085200"} Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.748037 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e532c044-5e87-4338-a3d2-dd43379c2ba8-config\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.748183 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e532c044-5e87-4338-a3d2-dd43379c2ba8-ovs-rundir\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.748220 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshb7\" (UniqueName: 
\"kubernetes.io/projected/e532c044-5e87-4338-a3d2-dd43379c2ba8-kube-api-access-jshb7\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.748255 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e532c044-5e87-4338-a3d2-dd43379c2ba8-ovn-rundir\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.854105 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e532c044-5e87-4338-a3d2-dd43379c2ba8-ovs-rundir\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.854201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jshb7\" (UniqueName: \"kubernetes.io/projected/e532c044-5e87-4338-a3d2-dd43379c2ba8-kube-api-access-jshb7\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.854255 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e532c044-5e87-4338-a3d2-dd43379c2ba8-ovn-rundir\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.854628 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/e532c044-5e87-4338-a3d2-dd43379c2ba8-ovs-rundir\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.854644 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e532c044-5e87-4338-a3d2-dd43379c2ba8-ovn-rundir\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.854832 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e532c044-5e87-4338-a3d2-dd43379c2ba8-config\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.856933 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e532c044-5e87-4338-a3d2-dd43379c2ba8-config\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.875659 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshb7\" (UniqueName: \"kubernetes.io/projected/e532c044-5e87-4338-a3d2-dd43379c2ba8-kube-api-access-jshb7\") pod \"ovn-controller-metrics-8mwqw\" (UID: \"e532c044-5e87-4338-a3d2-dd43379c2ba8\") " pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:08 crc kubenswrapper[4921]: I0318 13:49:08.888926 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8mwqw" Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.350556 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8mwqw"] Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.512394 4921 scope.go:117] "RemoveContainer" containerID="56a63e99653a80a876b71fa71aca2dd9d05733692b7ba92d42b45b6e79149cc9" Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.552662 4921 scope.go:117] "RemoveContainer" containerID="83be224397eb769822f6c508db1bb8357cf611c078710aa05d490bf6edfc1b5d" Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.602541 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8mwqw" event={"ID":"e532c044-5e87-4338-a3d2-dd43379c2ba8","Type":"ContainerStarted","Data":"9913ee9940a14a10fe3d96a0ee2b7510346f09a1f98f9752f3481823b79e9726"} Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.606714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-757l2" event={"ID":"f54f1467-48d2-424f-b694-485a89daea5d","Type":"ContainerStarted","Data":"7e7bc3c33356fa7d45084d98515cd9b993d5541f97f2a61eb46818c496ba1b22"} Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.606758 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-757l2" event={"ID":"f54f1467-48d2-424f-b694-485a89daea5d","Type":"ContainerStarted","Data":"a1c4dffc74e9c41116add20be0c2548d0675ce83bc225f14bb4e8b31ce1461ed"} Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.607158 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:09 crc kubenswrapper[4921]: I0318 13:49:09.631297 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-757l2" podStartSLOduration=4.631280637 podStartE2EDuration="4.631280637s" podCreationTimestamp="2026-03-18 13:49:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:49:09.624288807 +0000 UTC m=+5969.174209456" watchObservedRunningTime="2026-03-18 13:49:09.631280637 +0000 UTC m=+5969.181201276" Mar 18 13:49:10 crc kubenswrapper[4921]: I0318 13:49:10.624167 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8mwqw" event={"ID":"e532c044-5e87-4338-a3d2-dd43379c2ba8","Type":"ContainerStarted","Data":"4c97ee95c27bbd3bc381711db35c9a8c1dfd48b6c9ed1717706b498b5e4b5972"} Mar 18 13:49:10 crc kubenswrapper[4921]: I0318 13:49:10.624719 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:10 crc kubenswrapper[4921]: I0318 13:49:10.686520 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8mwqw" podStartSLOduration=2.686492821 podStartE2EDuration="2.686492821s" podCreationTimestamp="2026-03-18 13:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:49:10.663973798 +0000 UTC m=+5970.213894447" watchObservedRunningTime="2026-03-18 13:49:10.686492821 +0000 UTC m=+5970.236413460" Mar 18 13:49:11 crc kubenswrapper[4921]: I0318 13:49:11.041509 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jcl92"] Mar 18 13:49:11 crc kubenswrapper[4921]: I0318 13:49:11.050649 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jcl92"] Mar 18 13:49:11 crc kubenswrapper[4921]: I0318 13:49:11.219555 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72391d03-9818-4c13-8327-d63607cd54fa" path="/var/lib/kubelet/pods/72391d03-9818-4c13-8327-d63607cd54fa/volumes" Mar 18 13:49:18 crc kubenswrapper[4921]: I0318 13:49:18.209392 4921 scope.go:117] "RemoveContainer" 
containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:49:18 crc kubenswrapper[4921]: I0318 13:49:18.711027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"038dd07b701a6099240a4d7ebb53358644804af29a553533cae1d5837f2d8758"} Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.418472 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-ct24h"] Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.421208 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.431358 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-ct24h"] Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.618010 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-operator-scripts\") pod \"octavia-db-create-ct24h\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.618146 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fr97\" (UniqueName: \"kubernetes.io/projected/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-kube-api-access-7fr97\") pod \"octavia-db-create-ct24h\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.719795 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-operator-scripts\") pod 
\"octavia-db-create-ct24h\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.719893 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fr97\" (UniqueName: \"kubernetes.io/projected/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-kube-api-access-7fr97\") pod \"octavia-db-create-ct24h\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.720963 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-operator-scripts\") pod \"octavia-db-create-ct24h\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.748219 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fr97\" (UniqueName: \"kubernetes.io/projected/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-kube-api-access-7fr97\") pod \"octavia-db-create-ct24h\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:20 crc kubenswrapper[4921]: I0318 13:49:20.754738 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.204660 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-ct24h"] Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.458537 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-2dd7-account-create-update-wmtpf"] Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.459891 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.462345 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.470602 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2dd7-account-create-update-wmtpf"] Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.544874 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhp7\" (UniqueName: \"kubernetes.io/projected/361d2424-db33-40cc-bc8e-0689aed6db2e-kube-api-access-9jhp7\") pod \"octavia-2dd7-account-create-update-wmtpf\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.545289 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/361d2424-db33-40cc-bc8e-0689aed6db2e-operator-scripts\") pod \"octavia-2dd7-account-create-update-wmtpf\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.647519 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhp7\" (UniqueName: \"kubernetes.io/projected/361d2424-db33-40cc-bc8e-0689aed6db2e-kube-api-access-9jhp7\") pod \"octavia-2dd7-account-create-update-wmtpf\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.647640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/361d2424-db33-40cc-bc8e-0689aed6db2e-operator-scripts\") pod 
\"octavia-2dd7-account-create-update-wmtpf\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.648390 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/361d2424-db33-40cc-bc8e-0689aed6db2e-operator-scripts\") pod \"octavia-2dd7-account-create-update-wmtpf\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.670171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhp7\" (UniqueName: \"kubernetes.io/projected/361d2424-db33-40cc-bc8e-0689aed6db2e-kube-api-access-9jhp7\") pod \"octavia-2dd7-account-create-update-wmtpf\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.738440 4921 generic.go:334] "Generic (PLEG): container finished" podID="d0ea264c-1a44-4eb8-b2f5-62542c7865e0" containerID="bdec4afe195b74f79c6942082d4eb4b734b92793c6803d8ebce791ac52bfb0c7" exitCode=0 Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.738490 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-ct24h" event={"ID":"d0ea264c-1a44-4eb8-b2f5-62542c7865e0","Type":"ContainerDied","Data":"bdec4afe195b74f79c6942082d4eb4b734b92793c6803d8ebce791ac52bfb0c7"} Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.738526 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-ct24h" event={"ID":"d0ea264c-1a44-4eb8-b2f5-62542c7865e0","Type":"ContainerStarted","Data":"c5b1ce0a0574590087e56c5e84e6d95de975a355d335e078a2cd40ebf305077a"} Mar 18 13:49:21 crc kubenswrapper[4921]: I0318 13:49:21.784994 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:22 crc kubenswrapper[4921]: I0318 13:49:22.219746 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2dd7-account-create-update-wmtpf"] Mar 18 13:49:22 crc kubenswrapper[4921]: W0318 13:49:22.220359 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361d2424_db33_40cc_bc8e_0689aed6db2e.slice/crio-b08bf1abf1f0d09bb127ca7eda00ddd4f3972b60d9f0f113a50bc1cd360cae0a WatchSource:0}: Error finding container b08bf1abf1f0d09bb127ca7eda00ddd4f3972b60d9f0f113a50bc1cd360cae0a: Status 404 returned error can't find the container with id b08bf1abf1f0d09bb127ca7eda00ddd4f3972b60d9f0f113a50bc1cd360cae0a Mar 18 13:49:22 crc kubenswrapper[4921]: I0318 13:49:22.750992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2dd7-account-create-update-wmtpf" event={"ID":"361d2424-db33-40cc-bc8e-0689aed6db2e","Type":"ContainerStarted","Data":"5b5517d29514e37f774ecafd1a20150e21555e8f525b2df22cebcabb3ee45be5"} Mar 18 13:49:22 crc kubenswrapper[4921]: I0318 13:49:22.751349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2dd7-account-create-update-wmtpf" event={"ID":"361d2424-db33-40cc-bc8e-0689aed6db2e","Type":"ContainerStarted","Data":"b08bf1abf1f0d09bb127ca7eda00ddd4f3972b60d9f0f113a50bc1cd360cae0a"} Mar 18 13:49:22 crc kubenswrapper[4921]: I0318 13:49:22.768582 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-2dd7-account-create-update-wmtpf" podStartSLOduration=1.7685643359999998 podStartE2EDuration="1.768564336s" podCreationTimestamp="2026-03-18 13:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:49:22.763957024 +0000 UTC m=+5982.313877673" watchObservedRunningTime="2026-03-18 
13:49:22.768564336 +0000 UTC m=+5982.318484975" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.102643 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.276883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-operator-scripts\") pod \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.277538 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fr97\" (UniqueName: \"kubernetes.io/projected/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-kube-api-access-7fr97\") pod \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\" (UID: \"d0ea264c-1a44-4eb8-b2f5-62542c7865e0\") " Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.278530 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0ea264c-1a44-4eb8-b2f5-62542c7865e0" (UID: "d0ea264c-1a44-4eb8-b2f5-62542c7865e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.284586 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-kube-api-access-7fr97" (OuterVolumeSpecName: "kube-api-access-7fr97") pod "d0ea264c-1a44-4eb8-b2f5-62542c7865e0" (UID: "d0ea264c-1a44-4eb8-b2f5-62542c7865e0"). InnerVolumeSpecName "kube-api-access-7fr97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.380002 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.380039 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fr97\" (UniqueName: \"kubernetes.io/projected/d0ea264c-1a44-4eb8-b2f5-62542c7865e0-kube-api-access-7fr97\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.761079 4921 generic.go:334] "Generic (PLEG): container finished" podID="361d2424-db33-40cc-bc8e-0689aed6db2e" containerID="5b5517d29514e37f774ecafd1a20150e21555e8f525b2df22cebcabb3ee45be5" exitCode=0 Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.761162 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2dd7-account-create-update-wmtpf" event={"ID":"361d2424-db33-40cc-bc8e-0689aed6db2e","Type":"ContainerDied","Data":"5b5517d29514e37f774ecafd1a20150e21555e8f525b2df22cebcabb3ee45be5"} Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.763016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-ct24h" event={"ID":"d0ea264c-1a44-4eb8-b2f5-62542c7865e0","Type":"ContainerDied","Data":"c5b1ce0a0574590087e56c5e84e6d95de975a355d335e078a2cd40ebf305077a"} Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.763062 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-ct24h" Mar 18 13:49:23 crc kubenswrapper[4921]: I0318 13:49:23.763083 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5b1ce0a0574590087e56c5e84e6d95de975a355d335e078a2cd40ebf305077a" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.049380 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-54nxb"] Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.060230 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-54nxb"] Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.181795 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.259447 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68984fe-0046-43ea-b5f1-2809a85ff847" path="/var/lib/kubelet/pods/a68984fe-0046-43ea-b5f1-2809a85ff847/volumes" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.337099 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/361d2424-db33-40cc-bc8e-0689aed6db2e-operator-scripts\") pod \"361d2424-db33-40cc-bc8e-0689aed6db2e\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.337624 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jhp7\" (UniqueName: \"kubernetes.io/projected/361d2424-db33-40cc-bc8e-0689aed6db2e-kube-api-access-9jhp7\") pod \"361d2424-db33-40cc-bc8e-0689aed6db2e\" (UID: \"361d2424-db33-40cc-bc8e-0689aed6db2e\") " Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.337863 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/361d2424-db33-40cc-bc8e-0689aed6db2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "361d2424-db33-40cc-bc8e-0689aed6db2e" (UID: "361d2424-db33-40cc-bc8e-0689aed6db2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.338099 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/361d2424-db33-40cc-bc8e-0689aed6db2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.363558 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361d2424-db33-40cc-bc8e-0689aed6db2e-kube-api-access-9jhp7" (OuterVolumeSpecName: "kube-api-access-9jhp7") pod "361d2424-db33-40cc-bc8e-0689aed6db2e" (UID: "361d2424-db33-40cc-bc8e-0689aed6db2e"). InnerVolumeSpecName "kube-api-access-9jhp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.439735 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jhp7\" (UniqueName: \"kubernetes.io/projected/361d2424-db33-40cc-bc8e-0689aed6db2e-kube-api-access-9jhp7\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.786811 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2dd7-account-create-update-wmtpf" event={"ID":"361d2424-db33-40cc-bc8e-0689aed6db2e","Type":"ContainerDied","Data":"b08bf1abf1f0d09bb127ca7eda00ddd4f3972b60d9f0f113a50bc1cd360cae0a"} Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.786853 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08bf1abf1f0d09bb127ca7eda00ddd4f3972b60d9f0f113a50bc1cd360cae0a" Mar 18 13:49:25 crc kubenswrapper[4921]: I0318 13:49:25.786910 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2dd7-account-create-update-wmtpf" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.897039 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-6gcjk"] Mar 18 13:49:26 crc kubenswrapper[4921]: E0318 13:49:26.897658 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361d2424-db33-40cc-bc8e-0689aed6db2e" containerName="mariadb-account-create-update" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.897670 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="361d2424-db33-40cc-bc8e-0689aed6db2e" containerName="mariadb-account-create-update" Mar 18 13:49:26 crc kubenswrapper[4921]: E0318 13:49:26.897686 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ea264c-1a44-4eb8-b2f5-62542c7865e0" containerName="mariadb-database-create" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.897692 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ea264c-1a44-4eb8-b2f5-62542c7865e0" containerName="mariadb-database-create" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.897859 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="361d2424-db33-40cc-bc8e-0689aed6db2e" containerName="mariadb-account-create-update" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.897891 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ea264c-1a44-4eb8-b2f5-62542c7865e0" containerName="mariadb-database-create" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.898485 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.917075 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-6gcjk"] Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.970772 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbm82\" (UniqueName: \"kubernetes.io/projected/5b12096d-0e60-4524-bfa7-34a2512b7292-kube-api-access-gbm82\") pod \"octavia-persistence-db-create-6gcjk\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:26 crc kubenswrapper[4921]: I0318 13:49:26.971066 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b12096d-0e60-4524-bfa7-34a2512b7292-operator-scripts\") pod \"octavia-persistence-db-create-6gcjk\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.072809 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbm82\" (UniqueName: \"kubernetes.io/projected/5b12096d-0e60-4524-bfa7-34a2512b7292-kube-api-access-gbm82\") pod \"octavia-persistence-db-create-6gcjk\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.072920 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b12096d-0e60-4524-bfa7-34a2512b7292-operator-scripts\") pod \"octavia-persistence-db-create-6gcjk\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.073705 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b12096d-0e60-4524-bfa7-34a2512b7292-operator-scripts\") pod \"octavia-persistence-db-create-6gcjk\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.093438 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbm82\" (UniqueName: \"kubernetes.io/projected/5b12096d-0e60-4524-bfa7-34a2512b7292-kube-api-access-gbm82\") pod \"octavia-persistence-db-create-6gcjk\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.220561 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.402696 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-a510-account-create-update-h8fgv"] Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.406061 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.408714 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.422716 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-a510-account-create-update-h8fgv"] Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.481497 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992pw\" (UniqueName: \"kubernetes.io/projected/7aa945ee-8749-453d-8d15-fc2f94c7877f-kube-api-access-992pw\") pod \"octavia-a510-account-create-update-h8fgv\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.481603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa945ee-8749-453d-8d15-fc2f94c7877f-operator-scripts\") pod \"octavia-a510-account-create-update-h8fgv\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.582798 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa945ee-8749-453d-8d15-fc2f94c7877f-operator-scripts\") pod \"octavia-a510-account-create-update-h8fgv\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.582899 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-992pw\" (UniqueName: 
\"kubernetes.io/projected/7aa945ee-8749-453d-8d15-fc2f94c7877f-kube-api-access-992pw\") pod \"octavia-a510-account-create-update-h8fgv\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.583600 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa945ee-8749-453d-8d15-fc2f94c7877f-operator-scripts\") pod \"octavia-a510-account-create-update-h8fgv\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.599508 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-992pw\" (UniqueName: \"kubernetes.io/projected/7aa945ee-8749-453d-8d15-fc2f94c7877f-kube-api-access-992pw\") pod \"octavia-a510-account-create-update-h8fgv\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: W0318 13:49:27.717482 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b12096d_0e60_4524_bfa7_34a2512b7292.slice/crio-1ec542f9df755611f33b84f06209f5fb9f78ef02c5910b81b3fa59a04d6e55b0 WatchSource:0}: Error finding container 1ec542f9df755611f33b84f06209f5fb9f78ef02c5910b81b3fa59a04d6e55b0: Status 404 returned error can't find the container with id 1ec542f9df755611f33b84f06209f5fb9f78ef02c5910b81b3fa59a04d6e55b0 Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.718860 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-6gcjk"] Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.730566 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:27 crc kubenswrapper[4921]: I0318 13:49:27.805478 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-6gcjk" event={"ID":"5b12096d-0e60-4524-bfa7-34a2512b7292","Type":"ContainerStarted","Data":"1ec542f9df755611f33b84f06209f5fb9f78ef02c5910b81b3fa59a04d6e55b0"} Mar 18 13:49:28 crc kubenswrapper[4921]: I0318 13:49:28.203064 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-a510-account-create-update-h8fgv"] Mar 18 13:49:28 crc kubenswrapper[4921]: I0318 13:49:28.821891 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a510-account-create-update-h8fgv" event={"ID":"7aa945ee-8749-453d-8d15-fc2f94c7877f","Type":"ContainerStarted","Data":"93c3bdf7f29dc0f99cc6d903b03be0f521e595f4eb839ed21c91416a85ca6c06"} Mar 18 13:49:28 crc kubenswrapper[4921]: I0318 13:49:28.822420 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a510-account-create-update-h8fgv" event={"ID":"7aa945ee-8749-453d-8d15-fc2f94c7877f","Type":"ContainerStarted","Data":"d7aaffe545f8345d7442d2457fa33e07a6cc4a4ee836f81a81a76cc294ce1004"} Mar 18 13:49:28 crc kubenswrapper[4921]: I0318 13:49:28.825358 4921 generic.go:334] "Generic (PLEG): container finished" podID="5b12096d-0e60-4524-bfa7-34a2512b7292" containerID="69bf16f18bc38d9fb59da4269c36da94fb68071a9ff700ccb99360fabc907c6b" exitCode=0 Mar 18 13:49:28 crc kubenswrapper[4921]: I0318 13:49:28.825415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-6gcjk" event={"ID":"5b12096d-0e60-4524-bfa7-34a2512b7292","Type":"ContainerDied","Data":"69bf16f18bc38d9fb59da4269c36da94fb68071a9ff700ccb99360fabc907c6b"} Mar 18 13:49:28 crc kubenswrapper[4921]: I0318 13:49:28.856989 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/octavia-a510-account-create-update-h8fgv" podStartSLOduration=1.85696933 podStartE2EDuration="1.85696933s" podCreationTimestamp="2026-03-18 13:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:49:28.847444048 +0000 UTC m=+5988.397364727" watchObservedRunningTime="2026-03-18 13:49:28.85696933 +0000 UTC m=+5988.406889959" Mar 18 13:49:29 crc kubenswrapper[4921]: I0318 13:49:29.838260 4921 generic.go:334] "Generic (PLEG): container finished" podID="7aa945ee-8749-453d-8d15-fc2f94c7877f" containerID="93c3bdf7f29dc0f99cc6d903b03be0f521e595f4eb839ed21c91416a85ca6c06" exitCode=0 Mar 18 13:49:29 crc kubenswrapper[4921]: I0318 13:49:29.838321 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a510-account-create-update-h8fgv" event={"ID":"7aa945ee-8749-453d-8d15-fc2f94c7877f","Type":"ContainerDied","Data":"93c3bdf7f29dc0f99cc6d903b03be0f521e595f4eb839ed21c91416a85ca6c06"} Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.210737 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.338652 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b12096d-0e60-4524-bfa7-34a2512b7292-operator-scripts\") pod \"5b12096d-0e60-4524-bfa7-34a2512b7292\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.339087 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbm82\" (UniqueName: \"kubernetes.io/projected/5b12096d-0e60-4524-bfa7-34a2512b7292-kube-api-access-gbm82\") pod \"5b12096d-0e60-4524-bfa7-34a2512b7292\" (UID: \"5b12096d-0e60-4524-bfa7-34a2512b7292\") " Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.339230 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b12096d-0e60-4524-bfa7-34a2512b7292-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b12096d-0e60-4524-bfa7-34a2512b7292" (UID: "5b12096d-0e60-4524-bfa7-34a2512b7292"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.339862 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b12096d-0e60-4524-bfa7-34a2512b7292-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.345890 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b12096d-0e60-4524-bfa7-34a2512b7292-kube-api-access-gbm82" (OuterVolumeSpecName: "kube-api-access-gbm82") pod "5b12096d-0e60-4524-bfa7-34a2512b7292" (UID: "5b12096d-0e60-4524-bfa7-34a2512b7292"). InnerVolumeSpecName "kube-api-access-gbm82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.442033 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbm82\" (UniqueName: \"kubernetes.io/projected/5b12096d-0e60-4524-bfa7-34a2512b7292-kube-api-access-gbm82\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.851402 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-6gcjk" Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.851486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-6gcjk" event={"ID":"5b12096d-0e60-4524-bfa7-34a2512b7292","Type":"ContainerDied","Data":"1ec542f9df755611f33b84f06209f5fb9f78ef02c5910b81b3fa59a04d6e55b0"} Mar 18 13:49:30 crc kubenswrapper[4921]: I0318 13:49:30.851790 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ec542f9df755611f33b84f06209f5fb9f78ef02c5910b81b3fa59a04d6e55b0" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.203764 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.359276 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-992pw\" (UniqueName: \"kubernetes.io/projected/7aa945ee-8749-453d-8d15-fc2f94c7877f-kube-api-access-992pw\") pod \"7aa945ee-8749-453d-8d15-fc2f94c7877f\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.359432 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa945ee-8749-453d-8d15-fc2f94c7877f-operator-scripts\") pod \"7aa945ee-8749-453d-8d15-fc2f94c7877f\" (UID: \"7aa945ee-8749-453d-8d15-fc2f94c7877f\") " Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.360412 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aa945ee-8749-453d-8d15-fc2f94c7877f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7aa945ee-8749-453d-8d15-fc2f94c7877f" (UID: "7aa945ee-8749-453d-8d15-fc2f94c7877f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.364287 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa945ee-8749-453d-8d15-fc2f94c7877f-kube-api-access-992pw" (OuterVolumeSpecName: "kube-api-access-992pw") pod "7aa945ee-8749-453d-8d15-fc2f94c7877f" (UID: "7aa945ee-8749-453d-8d15-fc2f94c7877f"). InnerVolumeSpecName "kube-api-access-992pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.462019 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-992pw\" (UniqueName: \"kubernetes.io/projected/7aa945ee-8749-453d-8d15-fc2f94c7877f-kube-api-access-992pw\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.462582 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aa945ee-8749-453d-8d15-fc2f94c7877f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.863378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a510-account-create-update-h8fgv" event={"ID":"7aa945ee-8749-453d-8d15-fc2f94c7877f","Type":"ContainerDied","Data":"d7aaffe545f8345d7442d2457fa33e07a6cc4a4ee836f81a81a76cc294ce1004"} Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.864327 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7aaffe545f8345d7442d2457fa33e07a6cc4a4ee836f81a81a76cc294ce1004" Mar 18 13:49:31 crc kubenswrapper[4921]: I0318 13:49:31.863422 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-a510-account-create-update-h8fgv" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.137564 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7495847675-nfrgn"] Mar 18 13:49:33 crc kubenswrapper[4921]: E0318 13:49:33.138906 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b12096d-0e60-4524-bfa7-34a2512b7292" containerName="mariadb-database-create" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.139014 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b12096d-0e60-4524-bfa7-34a2512b7292" containerName="mariadb-database-create" Mar 18 13:49:33 crc kubenswrapper[4921]: E0318 13:49:33.139096 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa945ee-8749-453d-8d15-fc2f94c7877f" containerName="mariadb-account-create-update" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.139184 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa945ee-8749-453d-8d15-fc2f94c7877f" containerName="mariadb-account-create-update" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.139504 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b12096d-0e60-4524-bfa7-34a2512b7292" containerName="mariadb-database-create" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.139600 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa945ee-8749-453d-8d15-fc2f94c7877f" containerName="mariadb-account-create-update" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.141060 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.144781 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.145055 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-8mnbm" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.146020 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.148301 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7495847675-nfrgn"] Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.312482 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-scripts\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.312576 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-config-data\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.312794 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-combined-ca-bundle\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 
13:49:33.312967 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-octavia-run\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.313559 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-config-data-merged\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.414917 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-octavia-run\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.415006 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-config-data-merged\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.415069 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-scripts\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.415104 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-config-data\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.415179 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-combined-ca-bundle\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.415515 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-octavia-run\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.415787 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-config-data-merged\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.420921 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-scripts\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.420958 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-config-data\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.421196 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b144d4-c8fe-40b4-ae86-fce2a83da57b-combined-ca-bundle\") pod \"octavia-api-7495847675-nfrgn\" (UID: \"d2b144d4-c8fe-40b4-ae86-fce2a83da57b\") " pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:33 crc kubenswrapper[4921]: I0318 13:49:33.460104 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:34 crc kubenswrapper[4921]: I0318 13:49:34.109992 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7495847675-nfrgn"] Mar 18 13:49:34 crc kubenswrapper[4921]: I0318 13:49:34.891677 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7495847675-nfrgn" event={"ID":"d2b144d4-c8fe-40b4-ae86-fce2a83da57b","Type":"ContainerStarted","Data":"49f17191f289d57a6fa4d6a94ed6e0cb4cc7d432981f906e7222e8994974f2b0"} Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.398459 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-p4s82" podUID="3ed65908-7baa-4263-adf8-14055c9fe856" containerName="ovn-controller" probeResult="failure" output=< Mar 18 13:49:41 crc kubenswrapper[4921]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 13:49:41 crc kubenswrapper[4921]: > Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.408992 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.410840 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovn-controller-ovs-757l2" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.538085 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-p4s82-config-6m5z5"] Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.539487 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.543027 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.575695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4s82-config-6m5z5"] Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.637082 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-log-ovn\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.637167 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-scripts\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.637221 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwv4p\" (UniqueName: \"kubernetes.io/projected/d02d7675-a4ea-46de-8334-3d035b85c05c-kube-api-access-vwv4p\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " 
pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.637296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-additional-scripts\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.637322 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.637340 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run-ovn\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.738309 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwv4p\" (UniqueName: \"kubernetes.io/projected/d02d7675-a4ea-46de-8334-3d035b85c05c-kube-api-access-vwv4p\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.738668 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-additional-scripts\") pod 
\"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.738780 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.738875 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run-ovn\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.739040 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-log-ovn\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.739188 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-scripts\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.739200 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: 
\"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.739209 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-log-ovn\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.739208 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run-ovn\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.739970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-additional-scripts\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.741188 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-scripts\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.760354 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwv4p\" (UniqueName: \"kubernetes.io/projected/d02d7675-a4ea-46de-8334-3d035b85c05c-kube-api-access-vwv4p\") pod \"ovn-controller-p4s82-config-6m5z5\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " 
pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:41 crc kubenswrapper[4921]: I0318 13:49:41.869618 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:44 crc kubenswrapper[4921]: I0318 13:49:44.645082 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4s82-config-6m5z5"] Mar 18 13:49:45 crc kubenswrapper[4921]: I0318 13:49:44.989999 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4s82-config-6m5z5" event={"ID":"d02d7675-a4ea-46de-8334-3d035b85c05c","Type":"ContainerStarted","Data":"62b849949f940a3874e26e5f7b3d5ebd9b5f2ac08f839c1ec1399e9d6ef5c94f"} Mar 18 13:49:45 crc kubenswrapper[4921]: I0318 13:49:44.990050 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4s82-config-6m5z5" event={"ID":"d02d7675-a4ea-46de-8334-3d035b85c05c","Type":"ContainerStarted","Data":"2df14e33275eacb65e9a6060bd44957d10ce260e57a6cb1cf9b20be77aae076f"} Mar 18 13:49:45 crc kubenswrapper[4921]: I0318 13:49:44.993567 4921 generic.go:334] "Generic (PLEG): container finished" podID="d2b144d4-c8fe-40b4-ae86-fce2a83da57b" containerID="cc674449c2e8441b8311e54958f1353d59d41c5f89e0e2ac373003eb51abdd28" exitCode=0 Mar 18 13:49:45 crc kubenswrapper[4921]: I0318 13:49:44.993613 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7495847675-nfrgn" event={"ID":"d2b144d4-c8fe-40b4-ae86-fce2a83da57b","Type":"ContainerDied","Data":"cc674449c2e8441b8311e54958f1353d59d41c5f89e0e2ac373003eb51abdd28"} Mar 18 13:49:45 crc kubenswrapper[4921]: I0318 13:49:45.017708 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-p4s82-config-6m5z5" podStartSLOduration=4.017685433 podStartE2EDuration="4.017685433s" podCreationTimestamp="2026-03-18 13:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:49:45.009687865 +0000 UTC m=+6004.559608504" watchObservedRunningTime="2026-03-18 13:49:45.017685433 +0000 UTC m=+6004.567606062" Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.005709 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7495847675-nfrgn" event={"ID":"d2b144d4-c8fe-40b4-ae86-fce2a83da57b","Type":"ContainerStarted","Data":"64f5087d1fef5597940909a9502d86da012d769c33d094d3fa9841b1956d50fc"} Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.006104 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.006131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7495847675-nfrgn" event={"ID":"d2b144d4-c8fe-40b4-ae86-fce2a83da57b","Type":"ContainerStarted","Data":"cbb84129ad52b1a5fff48722531c4be04e1c23054391affffd3b2986862e6678"} Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.007668 4921 generic.go:334] "Generic (PLEG): container finished" podID="d02d7675-a4ea-46de-8334-3d035b85c05c" containerID="62b849949f940a3874e26e5f7b3d5ebd9b5f2ac08f839c1ec1399e9d6ef5c94f" exitCode=0 Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.007710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4s82-config-6m5z5" event={"ID":"d02d7675-a4ea-46de-8334-3d035b85c05c","Type":"ContainerDied","Data":"62b849949f940a3874e26e5f7b3d5ebd9b5f2ac08f839c1ec1399e9d6ef5c94f"} Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.032247 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7495847675-nfrgn" podStartSLOduration=2.933249447 podStartE2EDuration="13.032232477s" podCreationTimestamp="2026-03-18 13:49:33 +0000 UTC" firstStartedPulling="2026-03-18 13:49:34.127034852 +0000 UTC m=+5993.676955491" 
lastFinishedPulling="2026-03-18 13:49:44.226017882 +0000 UTC m=+6003.775938521" observedRunningTime="2026-03-18 13:49:46.022071536 +0000 UTC m=+6005.571992175" watchObservedRunningTime="2026-03-18 13:49:46.032232477 +0000 UTC m=+6005.582153116" Mar 18 13:49:46 crc kubenswrapper[4921]: I0318 13:49:46.351457 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-p4s82" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.017583 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.505790 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605535 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run\") pod \"d02d7675-a4ea-46de-8334-3d035b85c05c\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605604 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-scripts\") pod \"d02d7675-a4ea-46de-8334-3d035b85c05c\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605740 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-log-ovn\") pod \"d02d7675-a4ea-46de-8334-3d035b85c05c\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605773 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-additional-scripts\") pod \"d02d7675-a4ea-46de-8334-3d035b85c05c\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605804 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run-ovn\") pod \"d02d7675-a4ea-46de-8334-3d035b85c05c\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605780 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run" (OuterVolumeSpecName: "var-run") pod "d02d7675-a4ea-46de-8334-3d035b85c05c" (UID: "d02d7675-a4ea-46de-8334-3d035b85c05c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605836 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwv4p\" (UniqueName: \"kubernetes.io/projected/d02d7675-a4ea-46de-8334-3d035b85c05c-kube-api-access-vwv4p\") pod \"d02d7675-a4ea-46de-8334-3d035b85c05c\" (UID: \"d02d7675-a4ea-46de-8334-3d035b85c05c\") " Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.605877 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d02d7675-a4ea-46de-8334-3d035b85c05c" (UID: "d02d7675-a4ea-46de-8334-3d035b85c05c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.606259 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.606277 4921 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.607633 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d02d7675-a4ea-46de-8334-3d035b85c05c" (UID: "d02d7675-a4ea-46de-8334-3d035b85c05c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.607694 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-scripts" (OuterVolumeSpecName: "scripts") pod "d02d7675-a4ea-46de-8334-3d035b85c05c" (UID: "d02d7675-a4ea-46de-8334-3d035b85c05c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.607697 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d02d7675-a4ea-46de-8334-3d035b85c05c" (UID: "d02d7675-a4ea-46de-8334-3d035b85c05c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.613524 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02d7675-a4ea-46de-8334-3d035b85c05c-kube-api-access-vwv4p" (OuterVolumeSpecName: "kube-api-access-vwv4p") pod "d02d7675-a4ea-46de-8334-3d035b85c05c" (UID: "d02d7675-a4ea-46de-8334-3d035b85c05c"). InnerVolumeSpecName "kube-api-access-vwv4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.709230 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.709300 4921 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d02d7675-a4ea-46de-8334-3d035b85c05c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.709312 4921 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d02d7675-a4ea-46de-8334-3d035b85c05c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.709327 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwv4p\" (UniqueName: \"kubernetes.io/projected/d02d7675-a4ea-46de-8334-3d035b85c05c-kube-api-access-vwv4p\") on node \"crc\" DevicePath \"\"" Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.739301 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-p4s82-config-6m5z5"] Mar 18 13:49:47 crc kubenswrapper[4921]: I0318 13:49:47.752449 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-p4s82-config-6m5z5"] Mar 18 13:49:48 crc kubenswrapper[4921]: I0318 13:49:48.025868 4921 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2df14e33275eacb65e9a6060bd44957d10ce260e57a6cb1cf9b20be77aae076f" Mar 18 13:49:48 crc kubenswrapper[4921]: I0318 13:49:48.025878 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4s82-config-6m5z5" Mar 18 13:49:49 crc kubenswrapper[4921]: I0318 13:49:49.240202 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02d7675-a4ea-46de-8334-3d035b85c05c" path="/var/lib/kubelet/pods/d02d7675-a4ea-46de-8334-3d035b85c05c/volumes" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.151758 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564030-8csrm"] Mar 18 13:50:00 crc kubenswrapper[4921]: E0318 13:50:00.153212 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02d7675-a4ea-46de-8334-3d035b85c05c" containerName="ovn-config" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.153233 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02d7675-a4ea-46de-8334-3d035b85c05c" containerName="ovn-config" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.153505 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02d7675-a4ea-46de-8334-3d035b85c05c" containerName="ovn-config" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.154266 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.157687 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.158023 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.159635 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.165884 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-8csrm"] Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.191618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rhnd\" (UniqueName: \"kubernetes.io/projected/0ad24e41-3dca-4d97-b126-6692b454bc28-kube-api-access-7rhnd\") pod \"auto-csr-approver-29564030-8csrm\" (UID: \"0ad24e41-3dca-4d97-b126-6692b454bc28\") " pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.293040 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rhnd\" (UniqueName: \"kubernetes.io/projected/0ad24e41-3dca-4d97-b126-6692b454bc28-kube-api-access-7rhnd\") pod \"auto-csr-approver-29564030-8csrm\" (UID: \"0ad24e41-3dca-4d97-b126-6692b454bc28\") " pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.323688 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rhnd\" (UniqueName: \"kubernetes.io/projected/0ad24e41-3dca-4d97-b126-6692b454bc28-kube-api-access-7rhnd\") pod \"auto-csr-approver-29564030-8csrm\" (UID: \"0ad24e41-3dca-4d97-b126-6692b454bc28\") " 
pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:00 crc kubenswrapper[4921]: I0318 13:50:00.483905 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:01 crc kubenswrapper[4921]: I0318 13:50:01.015405 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-8csrm"] Mar 18 13:50:01 crc kubenswrapper[4921]: I0318 13:50:01.168334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-8csrm" event={"ID":"0ad24e41-3dca-4d97-b126-6692b454bc28","Type":"ContainerStarted","Data":"0388bcb5bdc53595acdf908be87c1ecceb5e638b8d55333b4902f13e4d8beb05"} Mar 18 13:50:03 crc kubenswrapper[4921]: I0318 13:50:03.186805 4921 generic.go:334] "Generic (PLEG): container finished" podID="0ad24e41-3dca-4d97-b126-6692b454bc28" containerID="7a854a2db0099288bc2fb4296cb843e15edd5f18e2703086f2befc7e7b0bf2ce" exitCode=0 Mar 18 13:50:03 crc kubenswrapper[4921]: I0318 13:50:03.186882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-8csrm" event={"ID":"0ad24e41-3dca-4d97-b126-6692b454bc28","Type":"ContainerDied","Data":"7a854a2db0099288bc2fb4296cb843e15edd5f18e2703086f2befc7e7b0bf2ce"} Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.061727 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-4fw67"] Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.063739 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.067735 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.067787 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.067975 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.077093 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-4fw67"] Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.178442 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c7adf497-5641-422f-a7c3-cdeb5f8741bc-hm-ports\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.178488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7adf497-5641-422f-a7c3-cdeb5f8741bc-config-data-merged\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.178803 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adf497-5641-422f-a7c3-cdeb5f8741bc-config-data\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.178850 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adf497-5641-422f-a7c3-cdeb5f8741bc-scripts\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.281549 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adf497-5641-422f-a7c3-cdeb5f8741bc-config-data\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.281896 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adf497-5641-422f-a7c3-cdeb5f8741bc-scripts\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.281974 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c7adf497-5641-422f-a7c3-cdeb5f8741bc-hm-ports\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.281994 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7adf497-5641-422f-a7c3-cdeb5f8741bc-config-data-merged\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.282559 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/c7adf497-5641-422f-a7c3-cdeb5f8741bc-config-data-merged\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.283076 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c7adf497-5641-422f-a7c3-cdeb5f8741bc-hm-ports\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.287793 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7adf497-5641-422f-a7c3-cdeb5f8741bc-config-data\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.288916 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7adf497-5641-422f-a7c3-cdeb5f8741bc-scripts\") pod \"octavia-rsyslog-4fw67\" (UID: \"c7adf497-5641-422f-a7c3-cdeb5f8741bc\") " pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.385956 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.537559 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.693638 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rhnd\" (UniqueName: \"kubernetes.io/projected/0ad24e41-3dca-4d97-b126-6692b454bc28-kube-api-access-7rhnd\") pod \"0ad24e41-3dca-4d97-b126-6692b454bc28\" (UID: \"0ad24e41-3dca-4d97-b126-6692b454bc28\") " Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.708769 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad24e41-3dca-4d97-b126-6692b454bc28-kube-api-access-7rhnd" (OuterVolumeSpecName: "kube-api-access-7rhnd") pod "0ad24e41-3dca-4d97-b126-6692b454bc28" (UID: "0ad24e41-3dca-4d97-b126-6692b454bc28"). InnerVolumeSpecName "kube-api-access-7rhnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.757216 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njbb9"] Mar 18 13:50:04 crc kubenswrapper[4921]: E0318 13:50:04.757616 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad24e41-3dca-4d97-b126-6692b454bc28" containerName="oc" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.757632 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad24e41-3dca-4d97-b126-6692b454bc28" containerName="oc" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.757802 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad24e41-3dca-4d97-b126-6692b454bc28" containerName="oc" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.761816 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.766012 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.784218 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njbb9"] Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.796902 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rhnd\" (UniqueName: \"kubernetes.io/projected/0ad24e41-3dca-4d97-b126-6692b454bc28-kube-api-access-7rhnd\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.899157 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2ebe59ac-d522-44c5-88ce-e6814bfae3af-amphora-image\") pod \"octavia-image-upload-59f8cff499-njbb9\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.899491 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ebe59ac-d522-44c5-88ce-e6814bfae3af-httpd-config\") pod \"octavia-image-upload-59f8cff499-njbb9\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:04 crc kubenswrapper[4921]: I0318 13:50:04.967601 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-4fw67"] Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.001605 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ebe59ac-d522-44c5-88ce-e6814bfae3af-httpd-config\") pod \"octavia-image-upload-59f8cff499-njbb9\" (UID: 
\"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.001723 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2ebe59ac-d522-44c5-88ce-e6814bfae3af-amphora-image\") pod \"octavia-image-upload-59f8cff499-njbb9\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.002486 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2ebe59ac-d522-44c5-88ce-e6814bfae3af-amphora-image\") pod \"octavia-image-upload-59f8cff499-njbb9\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.006473 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ebe59ac-d522-44c5-88ce-e6814bfae3af-httpd-config\") pod \"octavia-image-upload-59f8cff499-njbb9\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.087370 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.163861 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-4fw67"] Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.254875 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-8csrm" Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.256282 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4fw67" event={"ID":"c7adf497-5641-422f-a7c3-cdeb5f8741bc","Type":"ContainerStarted","Data":"bfe320ecf54206945144c31f89f49b78509b710527124e4c3959812f62b01bee"} Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.256331 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-8csrm" event={"ID":"0ad24e41-3dca-4d97-b126-6692b454bc28","Type":"ContainerDied","Data":"0388bcb5bdc53595acdf908be87c1ecceb5e638b8d55333b4902f13e4d8beb05"} Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.256349 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0388bcb5bdc53595acdf908be87c1ecceb5e638b8d55333b4902f13e4d8beb05" Mar 18 13:50:05 crc kubenswrapper[4921]: W0318 13:50:05.601693 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe59ac_d522_44c5_88ce_e6814bfae3af.slice/crio-5e05ffedca274717d6fa055e9b721255926719d48864864e35473a624b717a6d WatchSource:0}: Error finding container 5e05ffedca274717d6fa055e9b721255926719d48864864e35473a624b717a6d: Status 404 returned error can't find the container with id 5e05ffedca274717d6fa055e9b721255926719d48864864e35473a624b717a6d Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.618901 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njbb9"] Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.627866 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-hps95"] Mar 18 13:50:05 crc kubenswrapper[4921]: I0318 13:50:05.638286 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-hps95"] Mar 18 
13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.268125 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njbb9" event={"ID":"2ebe59ac-d522-44c5-88ce-e6814bfae3af","Type":"ContainerStarted","Data":"5e05ffedca274717d6fa055e9b721255926719d48864864e35473a624b717a6d"} Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.412076 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-g72k9"] Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.419022 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.425688 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.432592 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-g72k9"] Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.456153 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-scripts\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.456234 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.456340 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-combined-ca-bundle\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.456407 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.561517 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.561610 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-combined-ca-bundle\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.561666 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.561749 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-scripts\") pod \"octavia-db-sync-g72k9\" (UID: 
\"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.564870 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.569715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-combined-ca-bundle\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.572202 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-scripts\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.572882 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data\") pod \"octavia-db-sync-g72k9\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:06 crc kubenswrapper[4921]: I0318 13:50:06.756769 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:07 crc kubenswrapper[4921]: I0318 13:50:07.227933 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fac776-8d23-4400-8243-3ef48f80b898" path="/var/lib/kubelet/pods/68fac776-8d23-4400-8243-3ef48f80b898/volumes" Mar 18 13:50:07 crc kubenswrapper[4921]: I0318 13:50:07.327604 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-g72k9"] Mar 18 13:50:07 crc kubenswrapper[4921]: W0318 13:50:07.697656 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb31fa49e_6ecc_4079_9f4b_76b05cba5fea.slice/crio-8944d7f7b3d8b97a436859966e632bb9e7af7978ec47b8e2c26c5221e2a6a534 WatchSource:0}: Error finding container 8944d7f7b3d8b97a436859966e632bb9e7af7978ec47b8e2c26c5221e2a6a534: Status 404 returned error can't find the container with id 8944d7f7b3d8b97a436859966e632bb9e7af7978ec47b8e2c26c5221e2a6a534 Mar 18 13:50:08 crc kubenswrapper[4921]: I0318 13:50:08.311245 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g72k9" event={"ID":"b31fa49e-6ecc-4079-9f4b-76b05cba5fea","Type":"ContainerStarted","Data":"8944d7f7b3d8b97a436859966e632bb9e7af7978ec47b8e2c26c5221e2a6a534"} Mar 18 13:50:08 crc kubenswrapper[4921]: I0318 13:50:08.313649 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4fw67" event={"ID":"c7adf497-5641-422f-a7c3-cdeb5f8741bc","Type":"ContainerStarted","Data":"3be7fa3cbd5b6cac7ef9dabcb1031ff6001fd1d53d8f5deda39076cafdfc0ea8"} Mar 18 13:50:09 crc kubenswrapper[4921]: I0318 13:50:09.416384 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 13:50:09 crc kubenswrapper[4921]: I0318 13:50:09.418991 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7495847675-nfrgn" Mar 18 
13:50:09 crc kubenswrapper[4921]: I0318 13:50:09.645835 4921 scope.go:117] "RemoveContainer" containerID="3f3776e8291f235fd28f10011807d0e6f79a195e770d2e905b94be9ddf9e6b09" Mar 18 13:50:09 crc kubenswrapper[4921]: I0318 13:50:09.767785 4921 scope.go:117] "RemoveContainer" containerID="7ab6d7e4c5f1e8d4501bc9049340b69dae819d320fc22a3125339511646e97c2" Mar 18 13:50:10 crc kubenswrapper[4921]: I0318 13:50:10.338188 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g72k9" event={"ID":"b31fa49e-6ecc-4079-9f4b-76b05cba5fea","Type":"ContainerStarted","Data":"e45d64d1c687b56dd7cc7d058090faad50e017927dbc67ab286d6d3bc3d107b9"} Mar 18 13:50:10 crc kubenswrapper[4921]: I0318 13:50:10.732965 4921 scope.go:117] "RemoveContainer" containerID="c6fb3d072a73a5df06123b12eef9cb41babe6480784877a459075862621412b7" Mar 18 13:50:11 crc kubenswrapper[4921]: I0318 13:50:11.357639 4921 generic.go:334] "Generic (PLEG): container finished" podID="c7adf497-5641-422f-a7c3-cdeb5f8741bc" containerID="3be7fa3cbd5b6cac7ef9dabcb1031ff6001fd1d53d8f5deda39076cafdfc0ea8" exitCode=0 Mar 18 13:50:11 crc kubenswrapper[4921]: I0318 13:50:11.357757 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4fw67" event={"ID":"c7adf497-5641-422f-a7c3-cdeb5f8741bc","Type":"ContainerDied","Data":"3be7fa3cbd5b6cac7ef9dabcb1031ff6001fd1d53d8f5deda39076cafdfc0ea8"} Mar 18 13:50:11 crc kubenswrapper[4921]: I0318 13:50:11.363237 4921 generic.go:334] "Generic (PLEG): container finished" podID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" containerID="e45d64d1c687b56dd7cc7d058090faad50e017927dbc67ab286d6d3bc3d107b9" exitCode=0 Mar 18 13:50:11 crc kubenswrapper[4921]: I0318 13:50:11.363340 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g72k9" event={"ID":"b31fa49e-6ecc-4079-9f4b-76b05cba5fea","Type":"ContainerDied","Data":"e45d64d1c687b56dd7cc7d058090faad50e017927dbc67ab286d6d3bc3d107b9"} Mar 18 13:50:12 crc 
kubenswrapper[4921]: I0318 13:50:12.395494 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g72k9" event={"ID":"b31fa49e-6ecc-4079-9f4b-76b05cba5fea","Type":"ContainerStarted","Data":"30f444677839be1824cf2d89174630b2e3e37dde5d69b24fba8e8aa5432a456c"} Mar 18 13:50:15 crc kubenswrapper[4921]: I0318 13:50:15.429977 4921 generic.go:334] "Generic (PLEG): container finished" podID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" containerID="30f444677839be1824cf2d89174630b2e3e37dde5d69b24fba8e8aa5432a456c" exitCode=0 Mar 18 13:50:15 crc kubenswrapper[4921]: I0318 13:50:15.430082 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g72k9" event={"ID":"b31fa49e-6ecc-4079-9f4b-76b05cba5fea","Type":"ContainerDied","Data":"30f444677839be1824cf2d89174630b2e3e37dde5d69b24fba8e8aa5432a456c"} Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.159253 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.257167 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-combined-ca-bundle\") pod \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.257223 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data\") pod \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.257403 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged\") pod \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.257476 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-scripts\") pod \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.262325 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data" (OuterVolumeSpecName: "config-data") pod "b31fa49e-6ecc-4079-9f4b-76b05cba5fea" (UID: "b31fa49e-6ecc-4079-9f4b-76b05cba5fea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.264786 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-scripts" (OuterVolumeSpecName: "scripts") pod "b31fa49e-6ecc-4079-9f4b-76b05cba5fea" (UID: "b31fa49e-6ecc-4079-9f4b-76b05cba5fea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:19 crc kubenswrapper[4921]: E0318 13:50:19.280637 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged podName:b31fa49e-6ecc-4079-9f4b-76b05cba5fea nodeName:}" failed. No retries permitted until 2026-03-18 13:50:19.780612195 +0000 UTC m=+6039.330532834 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data-merged" (UniqueName: "kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged") pod "b31fa49e-6ecc-4079-9f4b-76b05cba5fea" (UID: "b31fa49e-6ecc-4079-9f4b-76b05cba5fea") : error deleting /var/lib/kubelet/pods/b31fa49e-6ecc-4079-9f4b-76b05cba5fea/volume-subpaths: remove /var/lib/kubelet/pods/b31fa49e-6ecc-4079-9f4b-76b05cba5fea/volume-subpaths: no such file or directory Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.282975 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b31fa49e-6ecc-4079-9f4b-76b05cba5fea" (UID: "b31fa49e-6ecc-4079-9f4b-76b05cba5fea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.359998 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.360046 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.360064 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.488665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-g72k9" event={"ID":"b31fa49e-6ecc-4079-9f4b-76b05cba5fea","Type":"ContainerDied","Data":"8944d7f7b3d8b97a436859966e632bb9e7af7978ec47b8e2c26c5221e2a6a534"} Mar 18 
13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.488712 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8944d7f7b3d8b97a436859966e632bb9e7af7978ec47b8e2c26c5221e2a6a534" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.488717 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-g72k9" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.868688 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged\") pod \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\" (UID: \"b31fa49e-6ecc-4079-9f4b-76b05cba5fea\") " Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.869266 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "b31fa49e-6ecc-4079-9f4b-76b05cba5fea" (UID: "b31fa49e-6ecc-4079-9f4b-76b05cba5fea"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:19 crc kubenswrapper[4921]: I0318 13:50:19.869625 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b31fa49e-6ecc-4079-9f4b-76b05cba5fea-config-data-merged\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:20 crc kubenswrapper[4921]: I0318 13:50:20.500276 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerID="4556ba35566cd16bc3e9089cfce9d705a6cdbc67a789dcc88db9287d062c897b" exitCode=0 Mar 18 13:50:20 crc kubenswrapper[4921]: I0318 13:50:20.500386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njbb9" event={"ID":"2ebe59ac-d522-44c5-88ce-e6814bfae3af","Type":"ContainerDied","Data":"4556ba35566cd16bc3e9089cfce9d705a6cdbc67a789dcc88db9287d062c897b"} Mar 18 13:50:20 crc kubenswrapper[4921]: I0318 13:50:20.504198 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4fw67" event={"ID":"c7adf497-5641-422f-a7c3-cdeb5f8741bc","Type":"ContainerStarted","Data":"7ab6a28a160ed934f482c581eff996d283dd8afbefac9e0e982a8ca491470400"} Mar 18 13:50:20 crc kubenswrapper[4921]: I0318 13:50:20.504398 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:20 crc kubenswrapper[4921]: I0318 13:50:20.552642 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-4fw67" podStartSLOduration=2.32477476 podStartE2EDuration="16.552610427s" podCreationTimestamp="2026-03-18 13:50:04 +0000 UTC" firstStartedPulling="2026-03-18 13:50:04.978586761 +0000 UTC m=+6024.528507400" lastFinishedPulling="2026-03-18 13:50:19.206422428 +0000 UTC m=+6038.756343067" observedRunningTime="2026-03-18 13:50:20.538267035 +0000 UTC m=+6040.088187674" watchObservedRunningTime="2026-03-18 13:50:20.552610427 +0000 UTC 
m=+6040.102531066" Mar 18 13:50:22 crc kubenswrapper[4921]: I0318 13:50:22.540979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njbb9" event={"ID":"2ebe59ac-d522-44c5-88ce-e6814bfae3af","Type":"ContainerStarted","Data":"59247b393de7d8e73cf98fe0ed7ac2c7d42cfed7291ccdc8306ce7991128b10e"} Mar 18 13:50:22 crc kubenswrapper[4921]: I0318 13:50:22.567130 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-njbb9" podStartSLOduration=2.075794182 podStartE2EDuration="18.567089051s" podCreationTimestamp="2026-03-18 13:50:04 +0000 UTC" firstStartedPulling="2026-03-18 13:50:05.605845351 +0000 UTC m=+6025.155765990" lastFinishedPulling="2026-03-18 13:50:22.09714021 +0000 UTC m=+6041.647060859" observedRunningTime="2026-03-18 13:50:22.556024563 +0000 UTC m=+6042.105945212" watchObservedRunningTime="2026-03-18 13:50:22.567089051 +0000 UTC m=+6042.117009690" Mar 18 13:50:34 crc kubenswrapper[4921]: I0318 13:50:34.424673 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-4fw67" Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.154912 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njbb9"] Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.155793 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-njbb9" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerName="octavia-amphora-httpd" containerID="cri-o://59247b393de7d8e73cf98fe0ed7ac2c7d42cfed7291ccdc8306ce7991128b10e" gracePeriod=30 Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.789036 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerID="59247b393de7d8e73cf98fe0ed7ac2c7d42cfed7291ccdc8306ce7991128b10e" exitCode=0 Mar 18 13:50:45 crc 
kubenswrapper[4921]: I0318 13:50:45.789413 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njbb9" event={"ID":"2ebe59ac-d522-44c5-88ce-e6814bfae3af","Type":"ContainerDied","Data":"59247b393de7d8e73cf98fe0ed7ac2c7d42cfed7291ccdc8306ce7991128b10e"} Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.789448 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-njbb9" event={"ID":"2ebe59ac-d522-44c5-88ce-e6814bfae3af","Type":"ContainerDied","Data":"5e05ffedca274717d6fa055e9b721255926719d48864864e35473a624b717a6d"} Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.789462 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e05ffedca274717d6fa055e9b721255926719d48864864e35473a624b717a6d" Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.790400 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.898468 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2ebe59ac-d522-44c5-88ce-e6814bfae3af-amphora-image\") pod \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.899029 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ebe59ac-d522-44c5-88ce-e6814bfae3af-httpd-config\") pod \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\" (UID: \"2ebe59ac-d522-44c5-88ce-e6814bfae3af\") " Mar 18 13:50:45 crc kubenswrapper[4921]: I0318 13:50:45.945823 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebe59ac-d522-44c5-88ce-e6814bfae3af-httpd-config" (OuterVolumeSpecName: "httpd-config") pod 
"2ebe59ac-d522-44c5-88ce-e6814bfae3af" (UID: "2ebe59ac-d522-44c5-88ce-e6814bfae3af"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:50:46 crc kubenswrapper[4921]: I0318 13:50:46.001418 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2ebe59ac-d522-44c5-88ce-e6814bfae3af-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:46 crc kubenswrapper[4921]: I0318 13:50:46.051366 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ebe59ac-d522-44c5-88ce-e6814bfae3af-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "2ebe59ac-d522-44c5-88ce-e6814bfae3af" (UID: "2ebe59ac-d522-44c5-88ce-e6814bfae3af"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:50:46 crc kubenswrapper[4921]: I0318 13:50:46.102672 4921 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2ebe59ac-d522-44c5-88ce-e6814bfae3af-amphora-image\") on node \"crc\" DevicePath \"\"" Mar 18 13:50:46 crc kubenswrapper[4921]: I0318 13:50:46.799730 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-njbb9" Mar 18 13:50:46 crc kubenswrapper[4921]: I0318 13:50:46.847099 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njbb9"] Mar 18 13:50:46 crc kubenswrapper[4921]: I0318 13:50:46.861297 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-njbb9"] Mar 18 13:50:47 crc kubenswrapper[4921]: I0318 13:50:47.220185 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" path="/var/lib/kubelet/pods/2ebe59ac-d522-44c5-88ce-e6814bfae3af/volumes" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.433564 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qxrzz"] Mar 18 13:50:49 crc kubenswrapper[4921]: E0318 13:50:49.434341 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerName="octavia-amphora-httpd" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.434356 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerName="octavia-amphora-httpd" Mar 18 13:50:49 crc kubenswrapper[4921]: E0318 13:50:49.434363 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerName="init" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.434368 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerName="init" Mar 18 13:50:49 crc kubenswrapper[4921]: E0318 13:50:49.434397 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" containerName="octavia-db-sync" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.434403 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" 
containerName="octavia-db-sync" Mar 18 13:50:49 crc kubenswrapper[4921]: E0318 13:50:49.434419 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" containerName="init" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.434424 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" containerName="init" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.434578 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" containerName="octavia-db-sync" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.434599 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ebe59ac-d522-44c5-88ce-e6814bfae3af" containerName="octavia-amphora-httpd" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.435648 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.439005 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.445598 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qxrzz"] Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.486460 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/8352349b-7195-4927-9502-3a0f15b09da0-amphora-image\") pod \"octavia-image-upload-59f8cff499-qxrzz\" (UID: \"8352349b-7195-4927-9502-3a0f15b09da0\") " pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.486617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/8352349b-7195-4927-9502-3a0f15b09da0-httpd-config\") pod \"octavia-image-upload-59f8cff499-qxrzz\" (UID: \"8352349b-7195-4927-9502-3a0f15b09da0\") " pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.589078 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/8352349b-7195-4927-9502-3a0f15b09da0-amphora-image\") pod \"octavia-image-upload-59f8cff499-qxrzz\" (UID: \"8352349b-7195-4927-9502-3a0f15b09da0\") " pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.589154 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8352349b-7195-4927-9502-3a0f15b09da0-httpd-config\") pod \"octavia-image-upload-59f8cff499-qxrzz\" (UID: \"8352349b-7195-4927-9502-3a0f15b09da0\") " pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.589703 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/8352349b-7195-4927-9502-3a0f15b09da0-amphora-image\") pod \"octavia-image-upload-59f8cff499-qxrzz\" (UID: \"8352349b-7195-4927-9502-3a0f15b09da0\") " pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.595608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8352349b-7195-4927-9502-3a0f15b09da0-httpd-config\") pod \"octavia-image-upload-59f8cff499-qxrzz\" (UID: \"8352349b-7195-4927-9502-3a0f15b09da0\") " pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:49 crc kubenswrapper[4921]: I0318 13:50:49.758176 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" Mar 18 13:50:50 crc kubenswrapper[4921]: I0318 13:50:50.241788 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qxrzz"] Mar 18 13:50:50 crc kubenswrapper[4921]: I0318 13:50:50.251057 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:50:50 crc kubenswrapper[4921]: I0318 13:50:50.843303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" event={"ID":"8352349b-7195-4927-9502-3a0f15b09da0","Type":"ContainerStarted","Data":"42dc52e917476889c1ce374c918549e5de211812afa47acfdf771a7f8f8fccf8"} Mar 18 13:50:51 crc kubenswrapper[4921]: I0318 13:50:51.854216 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" event={"ID":"8352349b-7195-4927-9502-3a0f15b09da0","Type":"ContainerStarted","Data":"f54530b0f86a8dce72aad03cc3a39389cbe3fb4d6b1b53f816d01aa02f2ffe39"} Mar 18 13:50:52 crc kubenswrapper[4921]: I0318 13:50:52.865919 4921 generic.go:334] "Generic (PLEG): container finished" podID="8352349b-7195-4927-9502-3a0f15b09da0" containerID="f54530b0f86a8dce72aad03cc3a39389cbe3fb4d6b1b53f816d01aa02f2ffe39" exitCode=0 Mar 18 13:50:52 crc kubenswrapper[4921]: I0318 13:50:52.865977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" event={"ID":"8352349b-7195-4927-9502-3a0f15b09da0","Type":"ContainerDied","Data":"f54530b0f86a8dce72aad03cc3a39389cbe3fb4d6b1b53f816d01aa02f2ffe39"} Mar 18 13:50:56 crc kubenswrapper[4921]: I0318 13:50:56.939858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" event={"ID":"8352349b-7195-4927-9502-3a0f15b09da0","Type":"ContainerStarted","Data":"b9b4aa64ad55287f66178bf2ed37dbdf1fb721cc8776e88563f7b516dfba5033"} Mar 18 13:50:56 crc 
kubenswrapper[4921]: I0318 13:50:56.954716 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-qxrzz" podStartSLOduration=2.509017951 podStartE2EDuration="7.954697941s" podCreationTimestamp="2026-03-18 13:50:49 +0000 UTC" firstStartedPulling="2026-03-18 13:50:50.250800764 +0000 UTC m=+6069.800721403" lastFinishedPulling="2026-03-18 13:50:55.696480754 +0000 UTC m=+6075.246401393" observedRunningTime="2026-03-18 13:50:56.953011182 +0000 UTC m=+6076.502931831" watchObservedRunningTime="2026-03-18 13:50:56.954697941 +0000 UTC m=+6076.504618580" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.858652 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-8xqfl"] Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.861068 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.863480 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.864432 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.864632 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.876801 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8xqfl"] Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.964322 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6a7589fc-80fa-4f87-988a-4d82d9a208c7-hm-ports\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " 
pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.964427 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6a7589fc-80fa-4f87-988a-4d82d9a208c7-config-data-merged\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.964480 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-config-data\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.964567 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-combined-ca-bundle\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.964618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-scripts\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:08 crc kubenswrapper[4921]: I0318 13:51:08.964766 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-amphora-certs\") pod \"octavia-healthmanager-8xqfl\" (UID: 
\"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.066295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-amphora-certs\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.066399 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6a7589fc-80fa-4f87-988a-4d82d9a208c7-hm-ports\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.066459 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6a7589fc-80fa-4f87-988a-4d82d9a208c7-config-data-merged\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.066483 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-config-data\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.066523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-combined-ca-bundle\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " 
pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.066574 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-scripts\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.067259 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6a7589fc-80fa-4f87-988a-4d82d9a208c7-config-data-merged\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.068005 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6a7589fc-80fa-4f87-988a-4d82d9a208c7-hm-ports\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.075167 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-scripts\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.076294 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-combined-ca-bundle\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.080362 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-config-data\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.080490 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6a7589fc-80fa-4f87-988a-4d82d9a208c7-amphora-certs\") pod \"octavia-healthmanager-8xqfl\" (UID: \"6a7589fc-80fa-4f87-988a-4d82d9a208c7\") " pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.187345 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:09 crc kubenswrapper[4921]: I0318 13:51:09.778324 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8xqfl"] Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.079159 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8xqfl" event={"ID":"6a7589fc-80fa-4f87-988a-4d82d9a208c7","Type":"ContainerStarted","Data":"3f110674bdf5b8bb14daff8feb2527d89ed4e8a789739a1c9b302af81be9ac58"} Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.543481 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-cwbms"] Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.545739 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.549146 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.549385 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.571220 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-cwbms"] Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.696751 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-config-data\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.696842 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-combined-ca-bundle\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.696940 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-amphora-certs\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.697013 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/fb567059-27c8-4f13-a81b-375261ab860a-config-data-merged\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.697054 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-scripts\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.697076 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fb567059-27c8-4f13-a81b-375261ab860a-hm-ports\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799226 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fb567059-27c8-4f13-a81b-375261ab860a-config-data-merged\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-scripts\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799312 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fb567059-27c8-4f13-a81b-375261ab860a-hm-ports\") 
pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-config-data\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799408 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-combined-ca-bundle\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799478 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-amphora-certs\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.799752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fb567059-27c8-4f13-a81b-375261ab860a-config-data-merged\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.800253 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fb567059-27c8-4f13-a81b-375261ab860a-hm-ports\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " 
pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.805882 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-amphora-certs\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.806954 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-combined-ca-bundle\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.807541 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-config-data\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.807694 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb567059-27c8-4f13-a81b-375261ab860a-scripts\") pod \"octavia-housekeeping-cwbms\" (UID: \"fb567059-27c8-4f13-a81b-375261ab860a\") " pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:10 crc kubenswrapper[4921]: I0318 13:51:10.872354 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.088909 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8xqfl" event={"ID":"6a7589fc-80fa-4f87-988a-4d82d9a208c7","Type":"ContainerStarted","Data":"07555ae4c69d8bb6c53652197b8b37ed5a71c367fac8ec8e127fa9c3d34282ba"} Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.464744 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-cwbms"] Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.642895 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-h76fx"] Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.644856 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.647810 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.648057 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.653097 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-h76fx"] Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.821338 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-config-data\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.821571 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-amphora-certs\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.821887 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3cefc2a0-92a2-454e-91e1-7285f024f7ac-hm-ports\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.821981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-combined-ca-bundle\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.822042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3cefc2a0-92a2-454e-91e1-7285f024f7ac-config-data-merged\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.822162 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-scripts\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.923418 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-amphora-certs\") 
pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.923520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3cefc2a0-92a2-454e-91e1-7285f024f7ac-hm-ports\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.923615 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-combined-ca-bundle\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.923661 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3cefc2a0-92a2-454e-91e1-7285f024f7ac-config-data-merged\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.923695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-scripts\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.923741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-config-data\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: 
I0318 13:51:11.924376 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3cefc2a0-92a2-454e-91e1-7285f024f7ac-config-data-merged\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.924818 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3cefc2a0-92a2-454e-91e1-7285f024f7ac-hm-ports\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.930152 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-combined-ca-bundle\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.932297 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-scripts\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.941849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-amphora-certs\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.944010 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3cefc2a0-92a2-454e-91e1-7285f024f7ac-config-data\") pod \"octavia-worker-h76fx\" (UID: \"3cefc2a0-92a2-454e-91e1-7285f024f7ac\") " pod="openstack/octavia-worker-h76fx" Mar 18 13:51:11 crc kubenswrapper[4921]: I0318 13:51:11.972318 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-h76fx" Mar 18 13:51:12 crc kubenswrapper[4921]: I0318 13:51:12.114428 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cwbms" event={"ID":"fb567059-27c8-4f13-a81b-375261ab860a","Type":"ContainerStarted","Data":"afd07fe1b7e83f126e8a3e7cac8deb03372b32f3af8a4389cb6b70c06302dab7"} Mar 18 13:51:12 crc kubenswrapper[4921]: I0318 13:51:12.647780 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-h76fx"] Mar 18 13:51:13 crc kubenswrapper[4921]: I0318 13:51:13.126521 4921 generic.go:334] "Generic (PLEG): container finished" podID="6a7589fc-80fa-4f87-988a-4d82d9a208c7" containerID="07555ae4c69d8bb6c53652197b8b37ed5a71c367fac8ec8e127fa9c3d34282ba" exitCode=0 Mar 18 13:51:13 crc kubenswrapper[4921]: I0318 13:51:13.126603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8xqfl" event={"ID":"6a7589fc-80fa-4f87-988a-4d82d9a208c7","Type":"ContainerDied","Data":"07555ae4c69d8bb6c53652197b8b37ed5a71c367fac8ec8e127fa9c3d34282ba"} Mar 18 13:51:13 crc kubenswrapper[4921]: I0318 13:51:13.129194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-h76fx" event={"ID":"3cefc2a0-92a2-454e-91e1-7285f024f7ac","Type":"ContainerStarted","Data":"06510af91d1d0fda18a7a63a72be3f3d6b5b57e5c91c0ad52ade3c80195b22d2"} Mar 18 13:51:13 crc kubenswrapper[4921]: I0318 13:51:13.452862 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8xqfl"] Mar 18 13:51:16 crc kubenswrapper[4921]: I0318 13:51:16.156581 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-healthmanager-8xqfl" event={"ID":"6a7589fc-80fa-4f87-988a-4d82d9a208c7","Type":"ContainerStarted","Data":"fd092db2754aecd8f21e09f50923e9dcef561c1be423bbf0b8fc41394d0a7be9"} Mar 18 13:51:16 crc kubenswrapper[4921]: I0318 13:51:16.156916 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:18 crc kubenswrapper[4921]: I0318 13:51:18.176033 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cwbms" event={"ID":"fb567059-27c8-4f13-a81b-375261ab860a","Type":"ContainerStarted","Data":"3cd0a8306cbd9ef9e3c8c582097e377150068aca92a680e2f4b1ed08f65b77ad"} Mar 18 13:51:18 crc kubenswrapper[4921]: I0318 13:51:18.234260 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-8xqfl" podStartSLOduration=10.234242143 podStartE2EDuration="10.234242143s" podCreationTimestamp="2026-03-18 13:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:51:16.177363062 +0000 UTC m=+6095.727283711" watchObservedRunningTime="2026-03-18 13:51:18.234242143 +0000 UTC m=+6097.784162782" Mar 18 13:51:19 crc kubenswrapper[4921]: I0318 13:51:19.209642 4921 generic.go:334] "Generic (PLEG): container finished" podID="fb567059-27c8-4f13-a81b-375261ab860a" containerID="3cd0a8306cbd9ef9e3c8c582097e377150068aca92a680e2f4b1ed08f65b77ad" exitCode=0 Mar 18 13:51:19 crc kubenswrapper[4921]: I0318 13:51:19.221367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cwbms" event={"ID":"fb567059-27c8-4f13-a81b-375261ab860a","Type":"ContainerDied","Data":"3cd0a8306cbd9ef9e3c8c582097e377150068aca92a680e2f4b1ed08f65b77ad"} Mar 18 13:51:20 crc kubenswrapper[4921]: I0318 13:51:20.221369 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cwbms" 
event={"ID":"fb567059-27c8-4f13-a81b-375261ab860a","Type":"ContainerStarted","Data":"2e22c54f2e827fa100f81cfcc0244b326e98491f521dcdeb315efbc33be24136"} Mar 18 13:51:20 crc kubenswrapper[4921]: I0318 13:51:20.222677 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:20 crc kubenswrapper[4921]: I0318 13:51:20.223441 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-h76fx" event={"ID":"3cefc2a0-92a2-454e-91e1-7285f024f7ac","Type":"ContainerStarted","Data":"07cfb0bd23814665e99b4143b5855b5ab030cea95112a8170ddeda3a0f27fd39"} Mar 18 13:51:20 crc kubenswrapper[4921]: I0318 13:51:20.243301 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-cwbms" podStartSLOduration=4.491051414 podStartE2EDuration="10.243282481s" podCreationTimestamp="2026-03-18 13:51:10 +0000 UTC" firstStartedPulling="2026-03-18 13:51:11.478269093 +0000 UTC m=+6091.028189732" lastFinishedPulling="2026-03-18 13:51:17.23050015 +0000 UTC m=+6096.780420799" observedRunningTime="2026-03-18 13:51:20.235239461 +0000 UTC m=+6099.785160110" watchObservedRunningTime="2026-03-18 13:51:20.243282481 +0000 UTC m=+6099.793203120" Mar 18 13:51:21 crc kubenswrapper[4921]: I0318 13:51:21.257869 4921 generic.go:334] "Generic (PLEG): container finished" podID="3cefc2a0-92a2-454e-91e1-7285f024f7ac" containerID="07cfb0bd23814665e99b4143b5855b5ab030cea95112a8170ddeda3a0f27fd39" exitCode=0 Mar 18 13:51:21 crc kubenswrapper[4921]: I0318 13:51:21.259680 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-h76fx" event={"ID":"3cefc2a0-92a2-454e-91e1-7285f024f7ac","Type":"ContainerDied","Data":"07cfb0bd23814665e99b4143b5855b5ab030cea95112a8170ddeda3a0f27fd39"} Mar 18 13:51:22 crc kubenswrapper[4921]: I0318 13:51:22.269481 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-h76fx" 
event={"ID":"3cefc2a0-92a2-454e-91e1-7285f024f7ac","Type":"ContainerStarted","Data":"16aa3d08c3a063da2cfb79668c7d5721d47d57d4eead7870a6e1b458039904f2"} Mar 18 13:51:22 crc kubenswrapper[4921]: I0318 13:51:22.269916 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-h76fx" Mar 18 13:51:22 crc kubenswrapper[4921]: I0318 13:51:22.300743 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-h76fx" podStartSLOduration=4.912980723 podStartE2EDuration="11.300719596s" podCreationTimestamp="2026-03-18 13:51:11 +0000 UTC" firstStartedPulling="2026-03-18 13:51:12.627559458 +0000 UTC m=+6092.177480097" lastFinishedPulling="2026-03-18 13:51:19.015298331 +0000 UTC m=+6098.565218970" observedRunningTime="2026-03-18 13:51:22.291408949 +0000 UTC m=+6101.841329588" watchObservedRunningTime="2026-03-18 13:51:22.300719596 +0000 UTC m=+6101.850640235" Mar 18 13:51:24 crc kubenswrapper[4921]: I0318 13:51:24.227052 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-8xqfl" Mar 18 13:51:25 crc kubenswrapper[4921]: I0318 13:51:25.902829 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-cwbms" Mar 18 13:51:27 crc kubenswrapper[4921]: I0318 13:51:27.015161 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-h76fx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.321189 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54d67cf5f5-pcjjx"] Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.325509 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.335992 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-l2phb" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.336029 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.336057 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.336267 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.355013 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d67cf5f5-pcjjx"] Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.374412 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.374709 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-log" containerID="cri-o://871ef9689fa23ce411276d8d8f20ff4f9d6d4c2aafc27282632e5e5cbdbb6692" gracePeriod=30 Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.375267 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-httpd" containerID="cri-o://3c577fa552ae3a4ed50a0e2913690b973ceab307e19a7f14017ba56effc08b15" gracePeriod=30 Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.454179 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c4dbdc547-wznj5"] Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.456941 4921 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.516407 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81af2677-6026-4718-9cb2-298263fa78a8-horizon-secret-key\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.516510 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-config-data\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.516555 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-scripts\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.516753 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81af2677-6026-4718-9cb2-298263fa78a8-logs\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.516813 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/81af2677-6026-4718-9cb2-298263fa78a8-kube-api-access-t64wv\") pod \"horizon-54d67cf5f5-pcjjx\" 
(UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.526254 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c4dbdc547-wznj5"] Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.544997 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.545310 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-log" containerID="cri-o://bfc018df9d6e84012230f326f54a78bc538ed65aa99536cd0dd2ebd1b2b84ed7" gracePeriod=30 Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.545794 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-httpd" containerID="cri-o://6ded7fcb9b153cbbdab97d24eb0a4079e14521cd7c324d07b4893694f02a2246" gracePeriod=30 Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/203cb28e-6e70-44c0-86f8-36a50a1d994f-logs\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624143 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/81af2677-6026-4718-9cb2-298263fa78a8-kube-api-access-t64wv\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624175 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-scripts\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624308 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81af2677-6026-4718-9cb2-298263fa78a8-horizon-secret-key\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624366 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-config-data\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-scripts\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624461 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtb9n\" (UniqueName: \"kubernetes.io/projected/203cb28e-6e70-44c0-86f8-36a50a1d994f-kube-api-access-rtb9n\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624512 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/203cb28e-6e70-44c0-86f8-36a50a1d994f-horizon-secret-key\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624566 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-config-data\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.624647 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81af2677-6026-4718-9cb2-298263fa78a8-logs\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.625273 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81af2677-6026-4718-9cb2-298263fa78a8-logs\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.625951 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-config-data\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.625951 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-scripts\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.631102 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81af2677-6026-4718-9cb2-298263fa78a8-horizon-secret-key\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.642171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/81af2677-6026-4718-9cb2-298263fa78a8-kube-api-access-t64wv\") pod \"horizon-54d67cf5f5-pcjjx\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.662528 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.727897 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/203cb28e-6e70-44c0-86f8-36a50a1d994f-logs\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.728194 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-scripts\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.728576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/203cb28e-6e70-44c0-86f8-36a50a1d994f-logs\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.728628 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtb9n\" (UniqueName: \"kubernetes.io/projected/203cb28e-6e70-44c0-86f8-36a50a1d994f-kube-api-access-rtb9n\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.728720 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/203cb28e-6e70-44c0-86f8-36a50a1d994f-horizon-secret-key\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.728812 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-config-data\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.729477 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-scripts\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.734686 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/203cb28e-6e70-44c0-86f8-36a50a1d994f-horizon-secret-key\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.737870 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-config-data\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.752202 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtb9n\" (UniqueName: \"kubernetes.io/projected/203cb28e-6e70-44c0-86f8-36a50a1d994f-kube-api-access-rtb9n\") pod \"horizon-7c4dbdc547-wznj5\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:35 crc kubenswrapper[4921]: I0318 13:51:35.794366 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.072942 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54d67cf5f5-pcjjx"] Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.134348 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79659f955c-rqc2t"] Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.136720 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.163516 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79659f955c-rqc2t"] Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.236408 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54d67cf5f5-pcjjx"] Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.240678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22516266-aa62-413f-bf35-5aa0add4af9e-horizon-secret-key\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.240740 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkqt\" (UniqueName: \"kubernetes.io/projected/22516266-aa62-413f-bf35-5aa0add4af9e-kube-api-access-nqkqt\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.240821 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-scripts\") pod \"horizon-79659f955c-rqc2t\" 
(UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.240866 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-config-data\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.241029 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22516266-aa62-413f-bf35-5aa0add4af9e-logs\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.343194 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-scripts\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.343247 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-config-data\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.343368 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22516266-aa62-413f-bf35-5aa0add4af9e-logs\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc 
kubenswrapper[4921]: I0318 13:51:36.343584 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22516266-aa62-413f-bf35-5aa0add4af9e-horizon-secret-key\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.343636 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkqt\" (UniqueName: \"kubernetes.io/projected/22516266-aa62-413f-bf35-5aa0add4af9e-kube-api-access-nqkqt\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.343915 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22516266-aa62-413f-bf35-5aa0add4af9e-logs\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.344244 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-scripts\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.348056 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-config-data\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.358882 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/22516266-aa62-413f-bf35-5aa0add4af9e-horizon-secret-key\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.362050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkqt\" (UniqueName: \"kubernetes.io/projected/22516266-aa62-413f-bf35-5aa0add4af9e-kube-api-access-nqkqt\") pod \"horizon-79659f955c-rqc2t\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.421620 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c4dbdc547-wznj5"] Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.435236 4921 generic.go:334] "Generic (PLEG): container finished" podID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerID="bfc018df9d6e84012230f326f54a78bc538ed65aa99536cd0dd2ebd1b2b84ed7" exitCode=143 Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.435315 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ee9259b-c033-4560-a793-ee436c2bf5b8","Type":"ContainerDied","Data":"bfc018df9d6e84012230f326f54a78bc538ed65aa99536cd0dd2ebd1b2b84ed7"} Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.436561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4dbdc547-wznj5" event={"ID":"203cb28e-6e70-44c0-86f8-36a50a1d994f","Type":"ContainerStarted","Data":"0d51813714c097a79840d02fd1e95b559932f794af33d3765bfe636580750044"} Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.438472 4921 generic.go:334] "Generic (PLEG): container finished" podID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerID="871ef9689fa23ce411276d8d8f20ff4f9d6d4c2aafc27282632e5e5cbdbb6692" exitCode=143 Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.438547 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b1eff78-0677-46af-aa7d-6426e107ca86","Type":"ContainerDied","Data":"871ef9689fa23ce411276d8d8f20ff4f9d6d4c2aafc27282632e5e5cbdbb6692"} Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.440039 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d67cf5f5-pcjjx" event={"ID":"81af2677-6026-4718-9cb2-298263fa78a8","Type":"ContainerStarted","Data":"bede1e66beb013abf46108e69c9e3f4c727459bb48b63b6b32011e66f78a369b"} Mar 18 13:51:36 crc kubenswrapper[4921]: I0318 13:51:36.469574 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:37 crc kubenswrapper[4921]: I0318 13:51:37.050408 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79659f955c-rqc2t"] Mar 18 13:51:37 crc kubenswrapper[4921]: W0318 13:51:37.062272 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22516266_aa62_413f_bf35_5aa0add4af9e.slice/crio-57185841c3db8165f6295045170c01c3c8b62dc47ddb4486d9b953321d4876f4 WatchSource:0}: Error finding container 57185841c3db8165f6295045170c01c3c8b62dc47ddb4486d9b953321d4876f4: Status 404 returned error can't find the container with id 57185841c3db8165f6295045170c01c3c8b62dc47ddb4486d9b953321d4876f4 Mar 18 13:51:37 crc kubenswrapper[4921]: I0318 13:51:37.454729 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79659f955c-rqc2t" event={"ID":"22516266-aa62-413f-bf35-5aa0add4af9e","Type":"ContainerStarted","Data":"57185841c3db8165f6295045170c01c3c8b62dc47ddb4486d9b953321d4876f4"} Mar 18 13:51:39 crc kubenswrapper[4921]: I0318 13:51:39.482200 4921 generic.go:334] "Generic (PLEG): container finished" podID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerID="6ded7fcb9b153cbbdab97d24eb0a4079e14521cd7c324d07b4893694f02a2246" exitCode=0 Mar 18 
13:51:39 crc kubenswrapper[4921]: I0318 13:51:39.482410 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ee9259b-c033-4560-a793-ee436c2bf5b8","Type":"ContainerDied","Data":"6ded7fcb9b153cbbdab97d24eb0a4079e14521cd7c324d07b4893694f02a2246"} Mar 18 13:51:39 crc kubenswrapper[4921]: I0318 13:51:39.485577 4921 generic.go:334] "Generic (PLEG): container finished" podID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerID="3c577fa552ae3a4ed50a0e2913690b973ceab307e19a7f14017ba56effc08b15" exitCode=0 Mar 18 13:51:39 crc kubenswrapper[4921]: I0318 13:51:39.485624 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b1eff78-0677-46af-aa7d-6426e107ca86","Type":"ContainerDied","Data":"3c577fa552ae3a4ed50a0e2913690b973ceab307e19a7f14017ba56effc08b15"} Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.500203 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.509317 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.587632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ee9259b-c033-4560-a793-ee436c2bf5b8","Type":"ContainerDied","Data":"5f1413b7181d5c023702ce414a05cfb13f8fb6bd8866bd807ff752ac3dd75f1a"} Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.587704 4921 scope.go:117] "RemoveContainer" containerID="6ded7fcb9b153cbbdab97d24eb0a4079e14521cd7c324d07b4893694f02a2246" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.587661 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.595168 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b1eff78-0677-46af-aa7d-6426e107ca86","Type":"ContainerDied","Data":"2bc6933ae93eedcd664aa389df462e212b027f10c50ad6d0c9c1ce5d83032267"} Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.595331 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.667593 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-scripts\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668023 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-scripts\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668161 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-config-data\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668324 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-logs\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668423 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-httpd-run\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668542 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-combined-ca-bundle\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668614 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-combined-ca-bundle\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668703 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-logs\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-config-data\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668880 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv7m9\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-kube-api-access-tv7m9\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: 
\"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.668969 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-httpd-run\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.669148 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scm88\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-kube-api-access-scm88\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.669257 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-ceph\") pod \"6b1eff78-0677-46af-aa7d-6426e107ca86\" (UID: \"6b1eff78-0677-46af-aa7d-6426e107ca86\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.669349 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-ceph\") pod \"5ee9259b-c033-4560-a793-ee436c2bf5b8\" (UID: \"5ee9259b-c033-4560-a793-ee436c2bf5b8\") " Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.669609 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-logs" (OuterVolumeSpecName: "logs") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.669850 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.670155 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.670248 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.670411 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-logs" (OuterVolumeSpecName: "logs") pod "5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.670961 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.674745 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-scripts" (OuterVolumeSpecName: "scripts") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.676249 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-scripts" (OuterVolumeSpecName: "scripts") pod "5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.680285 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-kube-api-access-tv7m9" (OuterVolumeSpecName: "kube-api-access-tv7m9") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "kube-api-access-tv7m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.680634 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-ceph" (OuterVolumeSpecName: "ceph") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.682322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-ceph" (OuterVolumeSpecName: "ceph") pod "5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.682867 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-kube-api-access-scm88" (OuterVolumeSpecName: "kube-api-access-scm88") pod "5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "kube-api-access-scm88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.731034 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.754248 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.766903 4921 scope.go:117] "RemoveContainer" containerID="bfc018df9d6e84012230f326f54a78bc538ed65aa99536cd0dd2ebd1b2b84ed7" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.770656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-config-data" (OuterVolumeSpecName: "config-data") pod "6b1eff78-0677-46af-aa7d-6426e107ca86" (UID: "6b1eff78-0677-46af-aa7d-6426e107ca86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772881 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scm88\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-kube-api-access-scm88\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772905 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772916 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ee9259b-c033-4560-a793-ee436c2bf5b8-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772927 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772936 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 
13:51:44.772948 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee9259b-c033-4560-a793-ee436c2bf5b8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772957 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772967 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772977 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1eff78-0677-46af-aa7d-6426e107ca86-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772987 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv7m9\" (UniqueName: \"kubernetes.io/projected/6b1eff78-0677-46af-aa7d-6426e107ca86-kube-api-access-tv7m9\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.772996 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b1eff78-0677-46af-aa7d-6426e107ca86-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.814397 4921 scope.go:117] "RemoveContainer" containerID="3c577fa552ae3a4ed50a0e2913690b973ceab307e19a7f14017ba56effc08b15" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.834991 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-config-data" (OuterVolumeSpecName: "config-data") pod 
"5ee9259b-c033-4560-a793-ee436c2bf5b8" (UID: "5ee9259b-c033-4560-a793-ee436c2bf5b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.849413 4921 scope.go:117] "RemoveContainer" containerID="871ef9689fa23ce411276d8d8f20ff4f9d6d4c2aafc27282632e5e5cbdbb6692" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.873939 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee9259b-c033-4560-a793-ee436c2bf5b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.941856 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:51:44 crc kubenswrapper[4921]: I0318 13:51:44.969410 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.000698 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.011670 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.027649 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: E0318 13:51:45.031716 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-log" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.031756 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-log" Mar 18 13:51:45 crc kubenswrapper[4921]: E0318 13:51:45.031777 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" 
containerName="glance-log" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.031786 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-log" Mar 18 13:51:45 crc kubenswrapper[4921]: E0318 13:51:45.031820 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-httpd" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.031829 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-httpd" Mar 18 13:51:45 crc kubenswrapper[4921]: E0318 13:51:45.031841 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-httpd" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.031849 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-httpd" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.032154 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-log" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.032180 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-log" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.032196 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" containerName="glance-httpd" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.032218 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" containerName="glance-httpd" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.033615 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.035738 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.035825 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.036040 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g2xns" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.042810 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.060173 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.063334 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.073489 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080178 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b42ed4c-386d-4808-924a-2595f4e8c98a-logs\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080231 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080282 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b42ed4c-386d-4808-924a-2595f4e8c98a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080303 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080325 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/733aa67f-6cfa-4a18-bc24-44e770159808-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080355 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080393 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6gw\" (UniqueName: \"kubernetes.io/projected/733aa67f-6cfa-4a18-bc24-44e770159808-kube-api-access-xl6gw\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733aa67f-6cfa-4a18-bc24-44e770159808-logs\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080431 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kk7v\" (UniqueName: \"kubernetes.io/projected/0b42ed4c-386d-4808-924a-2595f4e8c98a-kube-api-access-5kk7v\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080447 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-config-data\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080512 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0b42ed4c-386d-4808-924a-2595f4e8c98a-ceph\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080540 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-scripts\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.080569 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/733aa67f-6cfa-4a18-bc24-44e770159808-ceph\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.089169 4921 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182514 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0b42ed4c-386d-4808-924a-2595f4e8c98a-ceph\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182571 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-scripts\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182605 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/733aa67f-6cfa-4a18-bc24-44e770159808-ceph\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182633 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b42ed4c-386d-4808-924a-2595f4e8c98a-logs\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182671 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182705 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182723 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b42ed4c-386d-4808-924a-2595f4e8c98a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182762 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/733aa67f-6cfa-4a18-bc24-44e770159808-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182859 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xl6gw\" (UniqueName: \"kubernetes.io/projected/733aa67f-6cfa-4a18-bc24-44e770159808-kube-api-access-xl6gw\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182890 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733aa67f-6cfa-4a18-bc24-44e770159808-logs\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182911 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kk7v\" (UniqueName: \"kubernetes.io/projected/0b42ed4c-386d-4808-924a-2595f4e8c98a-kube-api-access-5kk7v\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.182928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-config-data\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.185486 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b42ed4c-386d-4808-924a-2595f4e8c98a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.186129 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0b42ed4c-386d-4808-924a-2595f4e8c98a-logs\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.187972 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/733aa67f-6cfa-4a18-bc24-44e770159808-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.188236 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/733aa67f-6cfa-4a18-bc24-44e770159808-logs\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.188352 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-config-data\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.192734 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.193761 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/733aa67f-6cfa-4a18-bc24-44e770159808-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.197798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0b42ed4c-386d-4808-924a-2595f4e8c98a-ceph\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.198184 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.198346 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.198539 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b42ed4c-386d-4808-924a-2595f4e8c98a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.204130 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/733aa67f-6cfa-4a18-bc24-44e770159808-scripts\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc 
kubenswrapper[4921]: I0318 13:51:45.205889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6gw\" (UniqueName: \"kubernetes.io/projected/733aa67f-6cfa-4a18-bc24-44e770159808-kube-api-access-xl6gw\") pod \"glance-default-internal-api-0\" (UID: \"733aa67f-6cfa-4a18-bc24-44e770159808\") " pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.217397 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kk7v\" (UniqueName: \"kubernetes.io/projected/0b42ed4c-386d-4808-924a-2595f4e8c98a-kube-api-access-5kk7v\") pod \"glance-default-external-api-0\" (UID: \"0b42ed4c-386d-4808-924a-2595f4e8c98a\") " pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.226481 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee9259b-c033-4560-a793-ee436c2bf5b8" path="/var/lib/kubelet/pods/5ee9259b-c033-4560-a793-ee436c2bf5b8/volumes" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.227122 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1eff78-0677-46af-aa7d-6426e107ca86" path="/var/lib/kubelet/pods/6b1eff78-0677-46af-aa7d-6426e107ca86/volumes" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.369835 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.384919 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.636129 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79659f955c-rqc2t" event={"ID":"22516266-aa62-413f-bf35-5aa0add4af9e","Type":"ContainerStarted","Data":"c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4"} Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.636760 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79659f955c-rqc2t" event={"ID":"22516266-aa62-413f-bf35-5aa0add4af9e","Type":"ContainerStarted","Data":"363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f"} Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.643432 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4dbdc547-wznj5" event={"ID":"203cb28e-6e70-44c0-86f8-36a50a1d994f","Type":"ContainerStarted","Data":"d9733068c0dea3979d23a04bee8869af33bdeee19cb08edaf3aebff7f36168bf"} Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.643475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4dbdc547-wznj5" event={"ID":"203cb28e-6e70-44c0-86f8-36a50a1d994f","Type":"ContainerStarted","Data":"8e9047c26a98572c655cc01cf8ca23483bc28f8fed0278e623abff69f97d46a2"} Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.664063 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d67cf5f5-pcjjx" event={"ID":"81af2677-6026-4718-9cb2-298263fa78a8","Type":"ContainerStarted","Data":"03f807a3e7aca3cbb51d782e8215ec1dec1e8843a234a4379d83008204a1f311"} Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.664135 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d67cf5f5-pcjjx" event={"ID":"81af2677-6026-4718-9cb2-298263fa78a8","Type":"ContainerStarted","Data":"d3ebcd79adb08594cbf1d98c32eebef6d55dacf4433a9918a1102827284e2fab"} Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.664322 
4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54d67cf5f5-pcjjx" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon-log" containerID="cri-o://d3ebcd79adb08594cbf1d98c32eebef6d55dacf4433a9918a1102827284e2fab" gracePeriod=30 Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.664586 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54d67cf5f5-pcjjx" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon" containerID="cri-o://03f807a3e7aca3cbb51d782e8215ec1dec1e8843a234a4379d83008204a1f311" gracePeriod=30 Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.664593 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.674900 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79659f955c-rqc2t" podStartSLOduration=2.237440158 podStartE2EDuration="9.674842009s" podCreationTimestamp="2026-03-18 13:51:36 +0000 UTC" firstStartedPulling="2026-03-18 13:51:37.082024448 +0000 UTC m=+6116.631945087" lastFinishedPulling="2026-03-18 13:51:44.519426299 +0000 UTC m=+6124.069346938" observedRunningTime="2026-03-18 13:51:45.659636643 +0000 UTC m=+6125.209557282" watchObservedRunningTime="2026-03-18 13:51:45.674842009 +0000 UTC m=+6125.224762638" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.699331 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c4dbdc547-wznj5" podStartSLOduration=2.650278127 podStartE2EDuration="10.69930644s" podCreationTimestamp="2026-03-18 13:51:35 +0000 UTC" firstStartedPulling="2026-03-18 13:51:36.425252952 +0000 UTC m=+6115.975173591" lastFinishedPulling="2026-03-18 13:51:44.474281255 +0000 UTC m=+6124.024201904" observedRunningTime="2026-03-18 13:51:45.683808186 +0000 UTC m=+6125.233728825" 
watchObservedRunningTime="2026-03-18 13:51:45.69930644 +0000 UTC m=+6125.249227079" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.712501 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54d67cf5f5-pcjjx" podStartSLOduration=2.497705894 podStartE2EDuration="10.712432686s" podCreationTimestamp="2026-03-18 13:51:35 +0000 UTC" firstStartedPulling="2026-03-18 13:51:36.222925283 +0000 UTC m=+6115.772845922" lastFinishedPulling="2026-03-18 13:51:44.437652075 +0000 UTC m=+6123.987572714" observedRunningTime="2026-03-18 13:51:45.704810798 +0000 UTC m=+6125.254731457" watchObservedRunningTime="2026-03-18 13:51:45.712432686 +0000 UTC m=+6125.262353325" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.794967 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:45 crc kubenswrapper[4921]: I0318 13:51:45.795018 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:51:46 crc kubenswrapper[4921]: I0318 13:51:46.010884 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 13:51:46 crc kubenswrapper[4921]: I0318 13:51:46.198983 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 13:51:46 crc kubenswrapper[4921]: I0318 13:51:46.475271 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:46 crc kubenswrapper[4921]: I0318 13:51:46.475596 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:51:46 crc kubenswrapper[4921]: I0318 13:51:46.719084 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"733aa67f-6cfa-4a18-bc24-44e770159808","Type":"ContainerStarted","Data":"45d6abd0205953f29e9e7f6fbd206df5ebe19b1150155db734d86450d293aeab"} Mar 18 13:51:46 crc kubenswrapper[4921]: I0318 13:51:46.721818 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b42ed4c-386d-4808-924a-2595f4e8c98a","Type":"ContainerStarted","Data":"3bc9796b5f983594b652b3c6b52f391d81494d365c9beb88dfaeddfda64d4453"} Mar 18 13:51:47 crc kubenswrapper[4921]: I0318 13:51:47.082167 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:51:47 crc kubenswrapper[4921]: I0318 13:51:47.082467 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:51:47 crc kubenswrapper[4921]: I0318 13:51:47.756322 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b42ed4c-386d-4808-924a-2595f4e8c98a","Type":"ContainerStarted","Data":"c22ba155265fb8a8ef6f614b5c2af8c95113f0f97178d22feb243e2baaf3d0a2"} Mar 18 13:51:47 crc kubenswrapper[4921]: I0318 13:51:47.769937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"733aa67f-6cfa-4a18-bc24-44e770159808","Type":"ContainerStarted","Data":"716978c53edcdc5f24ed6f31a6530364a5dc1c3b249962e86a44a29e52845b4a"} Mar 18 13:51:48 crc kubenswrapper[4921]: I0318 13:51:48.782324 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"0b42ed4c-386d-4808-924a-2595f4e8c98a","Type":"ContainerStarted","Data":"ae21f02f62dc6f9da537693cdce37b48c5f8aff9754444c7db6f68e42b3ba999"} Mar 18 13:51:48 crc kubenswrapper[4921]: I0318 13:51:48.785376 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"733aa67f-6cfa-4a18-bc24-44e770159808","Type":"ContainerStarted","Data":"ad8480c774c7c99bd33bff336edb51971ee6a9cdb00448ae539019d9c634c584"} Mar 18 13:51:48 crc kubenswrapper[4921]: I0318 13:51:48.854500 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.854205554 podStartE2EDuration="4.854205554s" podCreationTimestamp="2026-03-18 13:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:51:48.812760646 +0000 UTC m=+6128.362681295" watchObservedRunningTime="2026-03-18 13:51:48.854205554 +0000 UTC m=+6128.404126203" Mar 18 13:51:48 crc kubenswrapper[4921]: I0318 13:51:48.859920 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.859900487 podStartE2EDuration="4.859900487s" podCreationTimestamp="2026-03-18 13:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:51:48.842619822 +0000 UTC m=+6128.392540471" watchObservedRunningTime="2026-03-18 13:51:48.859900487 +0000 UTC m=+6128.409821126" Mar 18 13:51:49 crc kubenswrapper[4921]: I0318 13:51:49.052255 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-p9ntc"] Mar 18 13:51:49 crc kubenswrapper[4921]: I0318 13:51:49.065963 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-p9ntc"] Mar 18 13:51:49 crc 
kubenswrapper[4921]: I0318 13:51:49.224604 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c79fa37-ae65-4e13-a045-1e998aac23ce" path="/var/lib/kubelet/pods/5c79fa37-ae65-4e13-a045-1e998aac23ce/volumes" Mar 18 13:51:50 crc kubenswrapper[4921]: I0318 13:51:50.037447 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-dfce-account-create-update-mkhgn"] Mar 18 13:51:50 crc kubenswrapper[4921]: I0318 13:51:50.052632 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-dfce-account-create-update-mkhgn"] Mar 18 13:51:51 crc kubenswrapper[4921]: I0318 13:51:51.221657 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b3eb96-3132-4a26-bb74-45dc59627711" path="/var/lib/kubelet/pods/33b3eb96-3132-4a26-bb74-45dc59627711/volumes" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.370412 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.372076 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.385909 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.385953 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.407059 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.436154 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 
13:51:55.439307 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.446349 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.663650 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.796819 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.857924 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.857963 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.857973 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:51:55 crc kubenswrapper[4921]: I0318 13:51:55.857982 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 13:51:56 crc kubenswrapper[4921]: I0318 13:51:56.037963 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lkbft"] Mar 18 13:51:56 crc kubenswrapper[4921]: I0318 13:51:56.052550 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lkbft"] Mar 18 13:51:56 crc kubenswrapper[4921]: I0318 13:51:56.472818 4921 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.143:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.143:8080: connect: connection refused" Mar 18 13:51:57 crc kubenswrapper[4921]: I0318 13:51:57.219254 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8fb267-1b96-4722-90ae-94b07acfa50b" path="/var/lib/kubelet/pods/ca8fb267-1b96-4722-90ae-94b07acfa50b/volumes" Mar 18 13:51:57 crc kubenswrapper[4921]: I0318 13:51:57.882640 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:51:57 crc kubenswrapper[4921]: I0318 13:51:57.882667 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:51:58 crc kubenswrapper[4921]: I0318 13:51:58.135801 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:51:58 crc kubenswrapper[4921]: I0318 13:51:58.140833 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 13:51:58 crc kubenswrapper[4921]: I0318 13:51:58.182545 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:51:58 crc kubenswrapper[4921]: I0318 13:51:58.182673 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 13:51:58 crc kubenswrapper[4921]: I0318 13:51:58.186266 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.185599 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564032-q9trx"] Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.188990 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.194030 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.194254 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.194408 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.217663 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-q9trx"] Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.282364 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2f6x\" (UniqueName: \"kubernetes.io/projected/5341abdc-0c56-42ee-9f32-5058baeaedcc-kube-api-access-b2f6x\") pod \"auto-csr-approver-29564032-q9trx\" (UID: \"5341abdc-0c56-42ee-9f32-5058baeaedcc\") " pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.384689 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2f6x\" (UniqueName: \"kubernetes.io/projected/5341abdc-0c56-42ee-9f32-5058baeaedcc-kube-api-access-b2f6x\") pod \"auto-csr-approver-29564032-q9trx\" (UID: \"5341abdc-0c56-42ee-9f32-5058baeaedcc\") " pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.413846 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2f6x\" (UniqueName: \"kubernetes.io/projected/5341abdc-0c56-42ee-9f32-5058baeaedcc-kube-api-access-b2f6x\") pod \"auto-csr-approver-29564032-q9trx\" (UID: \"5341abdc-0c56-42ee-9f32-5058baeaedcc\") " 
pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:00 crc kubenswrapper[4921]: I0318 13:52:00.512359 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:01 crc kubenswrapper[4921]: I0318 13:52:01.000797 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-q9trx"] Mar 18 13:52:01 crc kubenswrapper[4921]: I0318 13:52:01.931896 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-q9trx" event={"ID":"5341abdc-0c56-42ee-9f32-5058baeaedcc","Type":"ContainerStarted","Data":"59276f8014745dff03dc81d5697fb6415fb725e9c9f71d904468f09bb9df506b"} Mar 18 13:52:04 crc kubenswrapper[4921]: I0318 13:52:04.962582 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-q9trx" event={"ID":"5341abdc-0c56-42ee-9f32-5058baeaedcc","Type":"ContainerStarted","Data":"c1c985eb23db6a055e2d1ea2a93a69fb4ec8d15b260cee77865accb089c9f492"} Mar 18 13:52:04 crc kubenswrapper[4921]: I0318 13:52:04.986028 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564032-q9trx" podStartSLOduration=1.461797027 podStartE2EDuration="4.986005668s" podCreationTimestamp="2026-03-18 13:52:00 +0000 UTC" firstStartedPulling="2026-03-18 13:52:01.005388564 +0000 UTC m=+6140.555309203" lastFinishedPulling="2026-03-18 13:52:04.529597205 +0000 UTC m=+6144.079517844" observedRunningTime="2026-03-18 13:52:04.976398662 +0000 UTC m=+6144.526319311" watchObservedRunningTime="2026-03-18 13:52:04.986005668 +0000 UTC m=+6144.535926307" Mar 18 13:52:05 crc kubenswrapper[4921]: I0318 13:52:05.796309 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" probeResult="failure" output="Get 
\"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Mar 18 13:52:05 crc kubenswrapper[4921]: I0318 13:52:05.976327 4921 generic.go:334] "Generic (PLEG): container finished" podID="5341abdc-0c56-42ee-9f32-5058baeaedcc" containerID="c1c985eb23db6a055e2d1ea2a93a69fb4ec8d15b260cee77865accb089c9f492" exitCode=0 Mar 18 13:52:05 crc kubenswrapper[4921]: I0318 13:52:05.976388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-q9trx" event={"ID":"5341abdc-0c56-42ee-9f32-5058baeaedcc","Type":"ContainerDied","Data":"c1c985eb23db6a055e2d1ea2a93a69fb4ec8d15b260cee77865accb089c9f492"} Mar 18 13:52:06 crc kubenswrapper[4921]: I0318 13:52:06.606508 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.143:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.143:8080: connect: connection refused" Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.375130 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.526251 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2f6x\" (UniqueName: \"kubernetes.io/projected/5341abdc-0c56-42ee-9f32-5058baeaedcc-kube-api-access-b2f6x\") pod \"5341abdc-0c56-42ee-9f32-5058baeaedcc\" (UID: \"5341abdc-0c56-42ee-9f32-5058baeaedcc\") " Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.533175 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5341abdc-0c56-42ee-9f32-5058baeaedcc-kube-api-access-b2f6x" (OuterVolumeSpecName: "kube-api-access-b2f6x") pod "5341abdc-0c56-42ee-9f32-5058baeaedcc" (UID: "5341abdc-0c56-42ee-9f32-5058baeaedcc"). InnerVolumeSpecName "kube-api-access-b2f6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.630461 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2f6x\" (UniqueName: \"kubernetes.io/projected/5341abdc-0c56-42ee-9f32-5058baeaedcc-kube-api-access-b2f6x\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.996102 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564032-q9trx" event={"ID":"5341abdc-0c56-42ee-9f32-5058baeaedcc","Type":"ContainerDied","Data":"59276f8014745dff03dc81d5697fb6415fb725e9c9f71d904468f09bb9df506b"} Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.996176 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564032-q9trx" Mar 18 13:52:07 crc kubenswrapper[4921]: I0318 13:52:07.996186 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59276f8014745dff03dc81d5697fb6415fb725e9c9f71d904468f09bb9df506b" Mar 18 13:52:08 crc kubenswrapper[4921]: I0318 13:52:08.045695 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-6mfj5"] Mar 18 13:52:08 crc kubenswrapper[4921]: I0318 13:52:08.055123 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-6mfj5"] Mar 18 13:52:09 crc kubenswrapper[4921]: I0318 13:52:09.226716 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30556c0c-ed1c-499b-92a9-bbda4534c1c7" path="/var/lib/kubelet/pods/30556c0c-ed1c-499b-92a9-bbda4534c1c7/volumes" Mar 18 13:52:11 crc kubenswrapper[4921]: I0318 13:52:11.706592 4921 scope.go:117] "RemoveContainer" containerID="d3859583c35b4254d7afb380a9fb6845b6aaa54f2e3f7f2fdd1e9da7d62b25c2" Mar 18 13:52:11 crc kubenswrapper[4921]: I0318 13:52:11.738525 4921 scope.go:117] "RemoveContainer" containerID="ac617f6c96fc3defd09220a35624ecd005f20833ab6bf85d916db1783ff2fbd5" Mar 18 13:52:11 crc kubenswrapper[4921]: I0318 13:52:11.785518 4921 scope.go:117] "RemoveContainer" containerID="42e230358254e0c58e862fcc3a40dcbe8a396e3eb1400160477b2f8a2a4ebfe4" Mar 18 13:52:11 crc kubenswrapper[4921]: I0318 13:52:11.853145 4921 scope.go:117] "RemoveContainer" containerID="dc48bb584ffeee1db218426789045e8bb44456274eba11a8ad6f76e174417222" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.080328 4921 generic.go:334] "Generic (PLEG): container finished" podID="81af2677-6026-4718-9cb2-298263fa78a8" containerID="03f807a3e7aca3cbb51d782e8215ec1dec1e8843a234a4379d83008204a1f311" exitCode=137 Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.080834 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="81af2677-6026-4718-9cb2-298263fa78a8" containerID="d3ebcd79adb08594cbf1d98c32eebef6d55dacf4433a9918a1102827284e2fab" exitCode=137 Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.080857 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d67cf5f5-pcjjx" event={"ID":"81af2677-6026-4718-9cb2-298263fa78a8","Type":"ContainerDied","Data":"03f807a3e7aca3cbb51d782e8215ec1dec1e8843a234a4379d83008204a1f311"} Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.080917 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d67cf5f5-pcjjx" event={"ID":"81af2677-6026-4718-9cb2-298263fa78a8","Type":"ContainerDied","Data":"d3ebcd79adb08594cbf1d98c32eebef6d55dacf4433a9918a1102827284e2fab"} Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.265893 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.402372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/81af2677-6026-4718-9cb2-298263fa78a8-kube-api-access-t64wv\") pod \"81af2677-6026-4718-9cb2-298263fa78a8\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.402455 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-config-data\") pod \"81af2677-6026-4718-9cb2-298263fa78a8\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.402488 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-scripts\") pod \"81af2677-6026-4718-9cb2-298263fa78a8\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " Mar 
18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.402558 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81af2677-6026-4718-9cb2-298263fa78a8-horizon-secret-key\") pod \"81af2677-6026-4718-9cb2-298263fa78a8\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.402591 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81af2677-6026-4718-9cb2-298263fa78a8-logs\") pod \"81af2677-6026-4718-9cb2-298263fa78a8\" (UID: \"81af2677-6026-4718-9cb2-298263fa78a8\") " Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.403627 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81af2677-6026-4718-9cb2-298263fa78a8-logs" (OuterVolumeSpecName: "logs") pod "81af2677-6026-4718-9cb2-298263fa78a8" (UID: "81af2677-6026-4718-9cb2-298263fa78a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.409093 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81af2677-6026-4718-9cb2-298263fa78a8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "81af2677-6026-4718-9cb2-298263fa78a8" (UID: "81af2677-6026-4718-9cb2-298263fa78a8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.412271 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81af2677-6026-4718-9cb2-298263fa78a8-kube-api-access-t64wv" (OuterVolumeSpecName: "kube-api-access-t64wv") pod "81af2677-6026-4718-9cb2-298263fa78a8" (UID: "81af2677-6026-4718-9cb2-298263fa78a8"). InnerVolumeSpecName "kube-api-access-t64wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.429832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-scripts" (OuterVolumeSpecName: "scripts") pod "81af2677-6026-4718-9cb2-298263fa78a8" (UID: "81af2677-6026-4718-9cb2-298263fa78a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.430838 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-config-data" (OuterVolumeSpecName: "config-data") pod "81af2677-6026-4718-9cb2-298263fa78a8" (UID: "81af2677-6026-4718-9cb2-298263fa78a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.505033 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81af2677-6026-4718-9cb2-298263fa78a8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.505100 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/81af2677-6026-4718-9cb2-298263fa78a8-kube-api-access-t64wv\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.505142 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:16 crc kubenswrapper[4921]: I0318 13:52:16.505160 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af2677-6026-4718-9cb2-298263fa78a8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:16 crc kubenswrapper[4921]: 
I0318 13:52:16.505180 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/81af2677-6026-4718-9cb2-298263fa78a8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.081529 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.081845 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.092529 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d67cf5f5-pcjjx" event={"ID":"81af2677-6026-4718-9cb2-298263fa78a8","Type":"ContainerDied","Data":"bede1e66beb013abf46108e69c9e3f4c727459bb48b63b6b32011e66f78a369b"} Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.092573 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d67cf5f5-pcjjx" Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.092577 4921 scope.go:117] "RemoveContainer" containerID="03f807a3e7aca3cbb51d782e8215ec1dec1e8843a234a4379d83008204a1f311" Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.144367 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54d67cf5f5-pcjjx"] Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.155632 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54d67cf5f5-pcjjx"] Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.226579 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81af2677-6026-4718-9cb2-298263fa78a8" path="/var/lib/kubelet/pods/81af2677-6026-4718-9cb2-298263fa78a8/volumes" Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.265878 4921 scope.go:117] "RemoveContainer" containerID="d3ebcd79adb08594cbf1d98c32eebef6d55dacf4433a9918a1102827284e2fab" Mar 18 13:52:17 crc kubenswrapper[4921]: I0318 13:52:17.783764 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:52:18 crc kubenswrapper[4921]: I0318 13:52:18.338896 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:52:19 crc kubenswrapper[4921]: I0318 13:52:19.579823 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:52:20 crc kubenswrapper[4921]: I0318 13:52:20.219927 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:52:20 crc kubenswrapper[4921]: I0318 13:52:20.285991 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c4dbdc547-wznj5"] Mar 18 13:52:20 crc kubenswrapper[4921]: I0318 13:52:20.286276 4921 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon-log" containerID="cri-o://8e9047c26a98572c655cc01cf8ca23483bc28f8fed0278e623abff69f97d46a2" gracePeriod=30 Mar 18 13:52:20 crc kubenswrapper[4921]: I0318 13:52:20.286309 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" containerID="cri-o://d9733068c0dea3979d23a04bee8869af33bdeee19cb08edaf3aebff7f36168bf" gracePeriod=30 Mar 18 13:52:24 crc kubenswrapper[4921]: I0318 13:52:24.047879 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dbvpz"] Mar 18 13:52:24 crc kubenswrapper[4921]: I0318 13:52:24.059257 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-aa34-account-create-update-h4nxm"] Mar 18 13:52:24 crc kubenswrapper[4921]: I0318 13:52:24.070199 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dbvpz"] Mar 18 13:52:24 crc kubenswrapper[4921]: I0318 13:52:24.077881 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-aa34-account-create-update-h4nxm"] Mar 18 13:52:24 crc kubenswrapper[4921]: I0318 13:52:24.170788 4921 generic.go:334] "Generic (PLEG): container finished" podID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerID="d9733068c0dea3979d23a04bee8869af33bdeee19cb08edaf3aebff7f36168bf" exitCode=0 Mar 18 13:52:24 crc kubenswrapper[4921]: I0318 13:52:24.170826 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4dbdc547-wznj5" event={"ID":"203cb28e-6e70-44c0-86f8-36a50a1d994f","Type":"ContainerDied","Data":"d9733068c0dea3979d23a04bee8869af33bdeee19cb08edaf3aebff7f36168bf"} Mar 18 13:52:25 crc kubenswrapper[4921]: I0318 13:52:25.220155 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9" 
path="/var/lib/kubelet/pods/1ec6a3cc-7b39-40f4-9977-8d0f12e44ea9/volumes" Mar 18 13:52:25 crc kubenswrapper[4921]: I0318 13:52:25.221456 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9360306-7a86-4b44-8284-9e5f9df08df6" path="/var/lib/kubelet/pods/e9360306-7a86-4b44-8284-9e5f9df08df6/volumes" Mar 18 13:52:25 crc kubenswrapper[4921]: I0318 13:52:25.795923 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Mar 18 13:52:34 crc kubenswrapper[4921]: I0318 13:52:34.033565 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rc4qq"] Mar 18 13:52:34 crc kubenswrapper[4921]: I0318 13:52:34.043812 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rc4qq"] Mar 18 13:52:35 crc kubenswrapper[4921]: I0318 13:52:35.222030 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fc5201-6049-4c9c-8e89-b6282f01a708" path="/var/lib/kubelet/pods/79fc5201-6049-4c9c-8e89-b6282f01a708/volumes" Mar 18 13:52:35 crc kubenswrapper[4921]: I0318 13:52:35.795919 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Mar 18 13:52:45 crc kubenswrapper[4921]: I0318 13:52:45.796411 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c4dbdc547-wznj5" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" probeResult="failure" output="Get 
\"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Mar 18 13:52:45 crc kubenswrapper[4921]: I0318 13:52:45.797060 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.081720 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.082140 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.082189 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.083024 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"038dd07b701a6099240a4d7ebb53358644804af29a553533cae1d5837f2d8758"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.083099 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" 
containerID="cri-o://038dd07b701a6099240a4d7ebb53358644804af29a553533cae1d5837f2d8758" gracePeriod=600 Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.380553 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="038dd07b701a6099240a4d7ebb53358644804af29a553533cae1d5837f2d8758" exitCode=0 Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.380608 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"038dd07b701a6099240a4d7ebb53358644804af29a553533cae1d5837f2d8758"} Mar 18 13:52:47 crc kubenswrapper[4921]: I0318 13:52:47.380660 4921 scope.go:117] "RemoveContainer" containerID="6700666e4edd91f5b3890275812d0b0205e1a27cca132c60c9a7768aac19698d" Mar 18 13:52:48 crc kubenswrapper[4921]: I0318 13:52:48.396297 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"} Mar 18 13:52:50 crc kubenswrapper[4921]: I0318 13:52:50.415458 4921 generic.go:334] "Generic (PLEG): container finished" podID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerID="8e9047c26a98572c655cc01cf8ca23483bc28f8fed0278e623abff69f97d46a2" exitCode=137 Mar 18 13:52:50 crc kubenswrapper[4921]: I0318 13:52:50.415553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4dbdc547-wznj5" event={"ID":"203cb28e-6e70-44c0-86f8-36a50a1d994f","Type":"ContainerDied","Data":"8e9047c26a98572c655cc01cf8ca23483bc28f8fed0278e623abff69f97d46a2"} Mar 18 13:52:50 crc kubenswrapper[4921]: I0318 13:52:50.867407 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.034290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-config-data\") pod \"203cb28e-6e70-44c0-86f8-36a50a1d994f\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.034390 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/203cb28e-6e70-44c0-86f8-36a50a1d994f-horizon-secret-key\") pod \"203cb28e-6e70-44c0-86f8-36a50a1d994f\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.034497 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-scripts\") pod \"203cb28e-6e70-44c0-86f8-36a50a1d994f\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.034595 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtb9n\" (UniqueName: \"kubernetes.io/projected/203cb28e-6e70-44c0-86f8-36a50a1d994f-kube-api-access-rtb9n\") pod \"203cb28e-6e70-44c0-86f8-36a50a1d994f\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.034687 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/203cb28e-6e70-44c0-86f8-36a50a1d994f-logs\") pod \"203cb28e-6e70-44c0-86f8-36a50a1d994f\" (UID: \"203cb28e-6e70-44c0-86f8-36a50a1d994f\") " Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.035569 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/203cb28e-6e70-44c0-86f8-36a50a1d994f-logs" (OuterVolumeSpecName: "logs") pod "203cb28e-6e70-44c0-86f8-36a50a1d994f" (UID: "203cb28e-6e70-44c0-86f8-36a50a1d994f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.047483 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203cb28e-6e70-44c0-86f8-36a50a1d994f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "203cb28e-6e70-44c0-86f8-36a50a1d994f" (UID: "203cb28e-6e70-44c0-86f8-36a50a1d994f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.062715 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203cb28e-6e70-44c0-86f8-36a50a1d994f-kube-api-access-rtb9n" (OuterVolumeSpecName: "kube-api-access-rtb9n") pod "203cb28e-6e70-44c0-86f8-36a50a1d994f" (UID: "203cb28e-6e70-44c0-86f8-36a50a1d994f"). InnerVolumeSpecName "kube-api-access-rtb9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.063736 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-config-data" (OuterVolumeSpecName: "config-data") pod "203cb28e-6e70-44c0-86f8-36a50a1d994f" (UID: "203cb28e-6e70-44c0-86f8-36a50a1d994f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.076792 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-scripts" (OuterVolumeSpecName: "scripts") pod "203cb28e-6e70-44c0-86f8-36a50a1d994f" (UID: "203cb28e-6e70-44c0-86f8-36a50a1d994f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.137416 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/203cb28e-6e70-44c0-86f8-36a50a1d994f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.137458 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.137470 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/203cb28e-6e70-44c0-86f8-36a50a1d994f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.137482 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203cb28e-6e70-44c0-86f8-36a50a1d994f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.137493 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtb9n\" (UniqueName: \"kubernetes.io/projected/203cb28e-6e70-44c0-86f8-36a50a1d994f-kube-api-access-rtb9n\") on node \"crc\" DevicePath \"\"" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.427922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4dbdc547-wznj5" event={"ID":"203cb28e-6e70-44c0-86f8-36a50a1d994f","Type":"ContainerDied","Data":"0d51813714c097a79840d02fd1e95b559932f794af33d3765bfe636580750044"} Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.428418 4921 scope.go:117] "RemoveContainer" containerID="d9733068c0dea3979d23a04bee8869af33bdeee19cb08edaf3aebff7f36168bf" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.428271 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c4dbdc547-wznj5" Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.461967 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c4dbdc547-wznj5"] Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.471317 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c4dbdc547-wznj5"] Mar 18 13:52:51 crc kubenswrapper[4921]: I0318 13:52:51.597621 4921 scope.go:117] "RemoveContainer" containerID="8e9047c26a98572c655cc01cf8ca23483bc28f8fed0278e623abff69f97d46a2" Mar 18 13:52:53 crc kubenswrapper[4921]: I0318 13:52:53.227021 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" path="/var/lib/kubelet/pods/203cb28e-6e70-44c0-86f8-36a50a1d994f/volumes" Mar 18 13:53:12 crc kubenswrapper[4921]: I0318 13:53:12.029633 4921 scope.go:117] "RemoveContainer" containerID="e5f87ed0a043e3c96c25cca2f00f679dde038e65bbd9bbb43a4ceacf6f3ae8df" Mar 18 13:53:12 crc kubenswrapper[4921]: I0318 13:53:12.067058 4921 scope.go:117] "RemoveContainer" containerID="44879c9cc26ff2973864295bab118d591f421619c856cd9e6cc1f7278e4a8bc4" Mar 18 13:53:12 crc kubenswrapper[4921]: I0318 13:53:12.102993 4921 scope.go:117] "RemoveContainer" containerID="941b4171b6f2f6047e5707ce8db88ed30f91dcf9b095ba38f6f31015fe597182" Mar 18 13:53:17 crc kubenswrapper[4921]: I0318 13:53:17.043196 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-49xjt"] Mar 18 13:53:17 crc kubenswrapper[4921]: I0318 13:53:17.054887 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-071c-account-create-update-286tt"] Mar 18 13:53:17 crc kubenswrapper[4921]: I0318 13:53:17.066740 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-49xjt"] Mar 18 13:53:17 crc kubenswrapper[4921]: I0318 13:53:17.078633 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-071c-account-create-update-286tt"] Mar 18 13:53:17 crc kubenswrapper[4921]: I0318 13:53:17.227488 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4971dbff-763d-416b-8582-9eda46f0baeb" path="/var/lib/kubelet/pods/4971dbff-763d-416b-8582-9eda46f0baeb/volumes" Mar 18 13:53:17 crc kubenswrapper[4921]: I0318 13:53:17.228393 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9625f730-5f70-4a22-a0ed-cffc4bbd0ced" path="/var/lib/kubelet/pods/9625f730-5f70-4a22-a0ed-cffc4bbd0ced/volumes" Mar 18 13:53:25 crc kubenswrapper[4921]: I0318 13:53:25.031450 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-f92d5"] Mar 18 13:53:25 crc kubenswrapper[4921]: I0318 13:53:25.042412 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-f92d5"] Mar 18 13:53:25 crc kubenswrapper[4921]: I0318 13:53:25.220740 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919236cf-190f-434e-ad68-aea95427a765" path="/var/lib/kubelet/pods/919236cf-190f-434e-ad68-aea95427a765/volumes" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.993065 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77cf57cd59-b9zl7"] Mar 18 13:53:27 crc kubenswrapper[4921]: E0318 13:53:27.994126 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon-log" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994147 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon-log" Mar 18 13:53:27 crc kubenswrapper[4921]: E0318 13:53:27.994158 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994165 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" Mar 18 13:53:27 crc kubenswrapper[4921]: E0318 13:53:27.994188 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5341abdc-0c56-42ee-9f32-5058baeaedcc" containerName="oc" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994195 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5341abdc-0c56-42ee-9f32-5058baeaedcc" containerName="oc" Mar 18 13:53:27 crc kubenswrapper[4921]: E0318 13:53:27.994211 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon-log" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994218 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon-log" Mar 18 13:53:27 crc kubenswrapper[4921]: E0318 13:53:27.994238 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994248 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994438 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994456 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="81af2677-6026-4718-9cb2-298263fa78a8" containerName="horizon-log" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994465 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon-log" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994485 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5341abdc-0c56-42ee-9f32-5058baeaedcc" containerName="oc" Mar 
18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.994498 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="203cb28e-6e70-44c0-86f8-36a50a1d994f" containerName="horizon" Mar 18 13:53:27 crc kubenswrapper[4921]: I0318 13:53:27.995755 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.009329 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77cf57cd59-b9zl7"] Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.054458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/935c08b2-0b06-41d3-809c-55ead7884c9c-scripts\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.054535 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/935c08b2-0b06-41d3-809c-55ead7884c9c-logs\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.054647 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdkz\" (UniqueName: \"kubernetes.io/projected/935c08b2-0b06-41d3-809c-55ead7884c9c-kube-api-access-jjdkz\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.054669 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/935c08b2-0b06-41d3-809c-55ead7884c9c-horizon-secret-key\") pod 
\"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.054720 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/935c08b2-0b06-41d3-809c-55ead7884c9c-config-data\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.157290 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdkz\" (UniqueName: \"kubernetes.io/projected/935c08b2-0b06-41d3-809c-55ead7884c9c-kube-api-access-jjdkz\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.157358 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/935c08b2-0b06-41d3-809c-55ead7884c9c-horizon-secret-key\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.157442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/935c08b2-0b06-41d3-809c-55ead7884c9c-config-data\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.157515 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/935c08b2-0b06-41d3-809c-55ead7884c9c-scripts\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " 
pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.157576 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/935c08b2-0b06-41d3-809c-55ead7884c9c-logs\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.158125 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/935c08b2-0b06-41d3-809c-55ead7884c9c-logs\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.158708 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/935c08b2-0b06-41d3-809c-55ead7884c9c-scripts\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.158902 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/935c08b2-0b06-41d3-809c-55ead7884c9c-config-data\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.162896 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/935c08b2-0b06-41d3-809c-55ead7884c9c-horizon-secret-key\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.174051 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jjdkz\" (UniqueName: \"kubernetes.io/projected/935c08b2-0b06-41d3-809c-55ead7884c9c-kube-api-access-jjdkz\") pod \"horizon-77cf57cd59-b9zl7\" (UID: \"935c08b2-0b06-41d3-809c-55ead7884c9c\") " pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.364592 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:28 crc kubenswrapper[4921]: I0318 13:53:28.970974 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77cf57cd59-b9zl7"] Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.652017 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-p8gcr"] Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.653888 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.662642 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-p8gcr"] Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.758479 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a95a-account-create-update-px4b8"] Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.759845 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.761406 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.778610 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a95a-account-create-update-px4b8"] Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.803707 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf57cd59-b9zl7" event={"ID":"935c08b2-0b06-41d3-809c-55ead7884c9c","Type":"ContainerStarted","Data":"a965d6d237bc9616a27a75362959bddcf1e1218f1d5d7fb519fd559522f1c324"} Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.803756 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf57cd59-b9zl7" event={"ID":"935c08b2-0b06-41d3-809c-55ead7884c9c","Type":"ContainerStarted","Data":"5ab1af4f5647c25da5957bd1dad151f684d8835616e2ec9d89ae0d5193ecf4ce"} Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.803768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cf57cd59-b9zl7" event={"ID":"935c08b2-0b06-41d3-809c-55ead7884c9c","Type":"ContainerStarted","Data":"47b698d3883a47d43bfdd9226ea9b675cab524b4cc170d03d4e5d8d377d55392"} Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.823801 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgwb\" (UniqueName: \"kubernetes.io/projected/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-kube-api-access-xlgwb\") pod \"heat-db-create-p8gcr\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.823916 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-operator-scripts\") pod \"heat-db-create-p8gcr\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.836245 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77cf57cd59-b9zl7" podStartSLOduration=2.836222714 podStartE2EDuration="2.836222714s" podCreationTimestamp="2026-03-18 13:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:53:29.832038304 +0000 UTC m=+6229.381958963" watchObservedRunningTime="2026-03-18 13:53:29.836222714 +0000 UTC m=+6229.386143353" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.926985 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-operator-scripts\") pod \"heat-db-create-p8gcr\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.927142 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkxgq\" (UniqueName: \"kubernetes.io/projected/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-kube-api-access-hkxgq\") pod \"heat-a95a-account-create-update-px4b8\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.927233 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-operator-scripts\") pod \"heat-a95a-account-create-update-px4b8\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " 
pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.927415 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgwb\" (UniqueName: \"kubernetes.io/projected/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-kube-api-access-xlgwb\") pod \"heat-db-create-p8gcr\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.928858 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-operator-scripts\") pod \"heat-db-create-p8gcr\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.947995 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgwb\" (UniqueName: \"kubernetes.io/projected/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-kube-api-access-xlgwb\") pod \"heat-db-create-p8gcr\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:29 crc kubenswrapper[4921]: I0318 13:53:29.985592 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.029371 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-operator-scripts\") pod \"heat-a95a-account-create-update-px4b8\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.029612 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkxgq\" (UniqueName: \"kubernetes.io/projected/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-kube-api-access-hkxgq\") pod \"heat-a95a-account-create-update-px4b8\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.030624 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-operator-scripts\") pod \"heat-a95a-account-create-update-px4b8\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.047304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkxgq\" (UniqueName: \"kubernetes.io/projected/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-kube-api-access-hkxgq\") pod \"heat-a95a-account-create-update-px4b8\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.105171 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.419131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-p8gcr"] Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.589824 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a95a-account-create-update-px4b8"] Mar 18 13:53:30 crc kubenswrapper[4921]: W0318 13:53:30.600787 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9443c0cc_9206_4b98_a5ca_d9ec17bdb1fa.slice/crio-311f026557fc2469d95c5e27e11acbc4bcb71f36761bdf4e37fcd6deb716b549 WatchSource:0}: Error finding container 311f026557fc2469d95c5e27e11acbc4bcb71f36761bdf4e37fcd6deb716b549: Status 404 returned error can't find the container with id 311f026557fc2469d95c5e27e11acbc4bcb71f36761bdf4e37fcd6deb716b549 Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.816033 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a95a-account-create-update-px4b8" event={"ID":"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa","Type":"ContainerStarted","Data":"acb77d81e41818c608172be5035eb3ec65c4978a866e71da8f463ce5c7e5af17"} Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.816094 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a95a-account-create-update-px4b8" event={"ID":"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa","Type":"ContainerStarted","Data":"311f026557fc2469d95c5e27e11acbc4bcb71f36761bdf4e37fcd6deb716b549"} Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.825124 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-p8gcr" event={"ID":"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5","Type":"ContainerStarted","Data":"59762b8a74b0604252b8c7c467b07edb6a5c799c5c10edc6f32fbea4c944fa70"} Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.825172 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/heat-db-create-p8gcr" event={"ID":"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5","Type":"ContainerStarted","Data":"8b149e2adc35780e6ddf21cc2b5b5c980b58a02ae40ecc3cff02d9699e1cf689"} Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.837720 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-a95a-account-create-update-px4b8" podStartSLOduration=1.83769412 podStartE2EDuration="1.83769412s" podCreationTimestamp="2026-03-18 13:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:53:30.829276359 +0000 UTC m=+6230.379197018" watchObservedRunningTime="2026-03-18 13:53:30.83769412 +0000 UTC m=+6230.387614759" Mar 18 13:53:30 crc kubenswrapper[4921]: I0318 13:53:30.856800 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-p8gcr" podStartSLOduration=1.8567746170000001 podStartE2EDuration="1.856774617s" podCreationTimestamp="2026-03-18 13:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:53:30.845028091 +0000 UTC m=+6230.394948730" watchObservedRunningTime="2026-03-18 13:53:30.856774617 +0000 UTC m=+6230.406695256" Mar 18 13:53:31 crc kubenswrapper[4921]: I0318 13:53:31.833400 4921 generic.go:334] "Generic (PLEG): container finished" podID="9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" containerID="acb77d81e41818c608172be5035eb3ec65c4978a866e71da8f463ce5c7e5af17" exitCode=0 Mar 18 13:53:31 crc kubenswrapper[4921]: I0318 13:53:31.833475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a95a-account-create-update-px4b8" event={"ID":"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa","Type":"ContainerDied","Data":"acb77d81e41818c608172be5035eb3ec65c4978a866e71da8f463ce5c7e5af17"} Mar 18 13:53:31 crc kubenswrapper[4921]: I0318 13:53:31.836173 
4921 generic.go:334] "Generic (PLEG): container finished" podID="def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" containerID="59762b8a74b0604252b8c7c467b07edb6a5c799c5c10edc6f32fbea4c944fa70" exitCode=0 Mar 18 13:53:31 crc kubenswrapper[4921]: I0318 13:53:31.836227 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-p8gcr" event={"ID":"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5","Type":"ContainerDied","Data":"59762b8a74b0604252b8c7c467b07edb6a5c799c5c10edc6f32fbea4c944fa70"} Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.338282 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.344738 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.509006 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgwb\" (UniqueName: \"kubernetes.io/projected/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-kube-api-access-xlgwb\") pod \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.509431 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-operator-scripts\") pod \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\" (UID: \"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5\") " Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.509590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkxgq\" (UniqueName: \"kubernetes.io/projected/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-kube-api-access-hkxgq\") pod \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " Mar 18 13:53:33 crc 
kubenswrapper[4921]: I0318 13:53:33.509619 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-operator-scripts\") pod \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\" (UID: \"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa\") " Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.510375 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" (UID: "def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.510571 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" (UID: "9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.515279 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-kube-api-access-xlgwb" (OuterVolumeSpecName: "kube-api-access-xlgwb") pod "def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" (UID: "def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5"). InnerVolumeSpecName "kube-api-access-xlgwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.516262 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-kube-api-access-hkxgq" (OuterVolumeSpecName: "kube-api-access-hkxgq") pod "9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" (UID: "9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa"). InnerVolumeSpecName "kube-api-access-hkxgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.612221 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkxgq\" (UniqueName: \"kubernetes.io/projected/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-kube-api-access-hkxgq\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.612254 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.612264 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgwb\" (UniqueName: \"kubernetes.io/projected/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-kube-api-access-xlgwb\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.612273 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.858873 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-p8gcr" event={"ID":"def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5","Type":"ContainerDied","Data":"8b149e2adc35780e6ddf21cc2b5b5c980b58a02ae40ecc3cff02d9699e1cf689"} Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.858913 4921 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b149e2adc35780e6ddf21cc2b5b5c980b58a02ae40ecc3cff02d9699e1cf689" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.858988 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-p8gcr" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.861039 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a95a-account-create-update-px4b8" event={"ID":"9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa","Type":"ContainerDied","Data":"311f026557fc2469d95c5e27e11acbc4bcb71f36761bdf4e37fcd6deb716b549"} Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.861086 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="311f026557fc2469d95c5e27e11acbc4bcb71f36761bdf4e37fcd6deb716b549" Mar 18 13:53:33 crc kubenswrapper[4921]: I0318 13:53:33.861186 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a95a-account-create-update-px4b8" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.919475 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-qfr9v"] Mar 18 13:53:34 crc kubenswrapper[4921]: E0318 13:53:34.920257 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" containerName="mariadb-account-create-update" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.920278 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" containerName="mariadb-account-create-update" Mar 18 13:53:34 crc kubenswrapper[4921]: E0318 13:53:34.920293 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" containerName="mariadb-database-create" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.920299 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" containerName="mariadb-database-create" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.920516 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" containerName="mariadb-account-create-update" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.920534 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" containerName="mariadb-database-create" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.921211 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.925184 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.925621 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-l2sr4" Mar 18 13:53:34 crc kubenswrapper[4921]: I0318 13:53:34.931174 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-qfr9v"] Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.038698 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbd4j\" (UniqueName: \"kubernetes.io/projected/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-kube-api-access-xbd4j\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.038894 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-combined-ca-bundle\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc 
kubenswrapper[4921]: I0318 13:53:35.038982 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-config-data\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.141144 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbd4j\" (UniqueName: \"kubernetes.io/projected/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-kube-api-access-xbd4j\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.141301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-combined-ca-bundle\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.141410 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-config-data\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.151432 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-combined-ca-bundle\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.151763 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-config-data\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.160871 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbd4j\" (UniqueName: \"kubernetes.io/projected/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-kube-api-access-xbd4j\") pod \"heat-db-sync-qfr9v\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.239515 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.782648 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-qfr9v"] Mar 18 13:53:35 crc kubenswrapper[4921]: I0318 13:53:35.879850 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qfr9v" event={"ID":"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f","Type":"ContainerStarted","Data":"8211833ab3130c2f049316c585b12bff12403dfac440b11903c5cfc19b8584dc"} Mar 18 13:53:38 crc kubenswrapper[4921]: I0318 13:53:38.366507 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:38 crc kubenswrapper[4921]: I0318 13:53:38.370010 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:53:43 crc kubenswrapper[4921]: I0318 13:53:43.977854 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qfr9v" event={"ID":"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f","Type":"ContainerStarted","Data":"177a4eeeb90b5611d6ad43d0683aabcdba4e5b7c4fc91edc4b0c40a1ddf69884"} Mar 18 13:53:43 crc kubenswrapper[4921]: I0318 13:53:43.995699 4921 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/heat-db-sync-qfr9v" podStartSLOduration=2.5844029170000002 podStartE2EDuration="9.995673119s" podCreationTimestamp="2026-03-18 13:53:34 +0000 UTC" firstStartedPulling="2026-03-18 13:53:35.785299411 +0000 UTC m=+6235.335220050" lastFinishedPulling="2026-03-18 13:53:43.196569613 +0000 UTC m=+6242.746490252" observedRunningTime="2026-03-18 13:53:43.990693336 +0000 UTC m=+6243.540613975" watchObservedRunningTime="2026-03-18 13:53:43.995673119 +0000 UTC m=+6243.545593758" Mar 18 13:53:48 crc kubenswrapper[4921]: I0318 13:53:48.020752 4921 generic.go:334] "Generic (PLEG): container finished" podID="49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" containerID="177a4eeeb90b5611d6ad43d0683aabcdba4e5b7c4fc91edc4b0c40a1ddf69884" exitCode=0 Mar 18 13:53:48 crc kubenswrapper[4921]: I0318 13:53:48.021399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qfr9v" event={"ID":"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f","Type":"ContainerDied","Data":"177a4eeeb90b5611d6ad43d0683aabcdba4e5b7c4fc91edc4b0c40a1ddf69884"} Mar 18 13:53:48 crc kubenswrapper[4921]: I0318 13:53:48.366476 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-77cf57cd59-b9zl7" podUID="935c08b2-0b06-41d3-809c-55ead7884c9c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.147:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.147:8080: connect: connection refused" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.449956 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.580432 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-combined-ca-bundle\") pod \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.580544 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbd4j\" (UniqueName: \"kubernetes.io/projected/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-kube-api-access-xbd4j\") pod \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.580650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-config-data\") pod \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\" (UID: \"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f\") " Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.586381 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-kube-api-access-xbd4j" (OuterVolumeSpecName: "kube-api-access-xbd4j") pod "49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" (UID: "49dbf0c7-4c4a-484b-ab76-fb64ea938c0f"). InnerVolumeSpecName "kube-api-access-xbd4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.608330 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" (UID: "49dbf0c7-4c4a-484b-ab76-fb64ea938c0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.668335 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-config-data" (OuterVolumeSpecName: "config-data") pod "49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" (UID: "49dbf0c7-4c4a-484b-ab76-fb64ea938c0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.682603 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbd4j\" (UniqueName: \"kubernetes.io/projected/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-kube-api-access-xbd4j\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.682674 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:49 crc kubenswrapper[4921]: I0318 13:53:49.682690 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:53:50 crc kubenswrapper[4921]: I0318 13:53:50.040102 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-qfr9v" event={"ID":"49dbf0c7-4c4a-484b-ab76-fb64ea938c0f","Type":"ContainerDied","Data":"8211833ab3130c2f049316c585b12bff12403dfac440b11903c5cfc19b8584dc"} Mar 18 13:53:50 crc kubenswrapper[4921]: I0318 13:53:50.040166 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-qfr9v" Mar 18 13:53:50 crc kubenswrapper[4921]: I0318 13:53:50.040180 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8211833ab3130c2f049316c585b12bff12403dfac440b11903c5cfc19b8584dc" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.146236 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-79569bdc57-vlxb2"] Mar 18 13:53:51 crc kubenswrapper[4921]: E0318 13:53:51.146918 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" containerName="heat-db-sync" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.146931 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" containerName="heat-db-sync" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.147135 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" containerName="heat-db-sync" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.147946 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.151861 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-l2sr4" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.151941 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.152068 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.163514 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79569bdc57-vlxb2"] Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.250184 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-57db5945bb-wkc7x"] Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.253814 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.256850 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.278416 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57db5945bb-wkc7x"] Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.321233 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptrk\" (UniqueName: \"kubernetes.io/projected/5876eeea-8bcc-4e40-8775-a9c183f68bed-kube-api-access-bptrk\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.321307 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-config-data\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.321383 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-combined-ca-bundle\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.321439 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-config-data-custom\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: 
\"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.344349 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-58c44c44b9-4hccn"] Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.345755 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.348130 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.349244 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58c44c44b9-4hccn"] Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.422946 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-config-data\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423045 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbp5v\" (UniqueName: \"kubernetes.io/projected/390d1c49-9e71-4990-9fe9-d98fdd866322-kube-api-access-fbp5v\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423077 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-combined-ca-bundle\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc 
kubenswrapper[4921]: I0318 13:53:51.423124 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbznj\" (UniqueName: \"kubernetes.io/projected/9cd166a7-f46c-4049-a692-3e0d3ba12606-kube-api-access-jbznj\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423144 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-combined-ca-bundle\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423175 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-config-data\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423224 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-config-data-custom\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423251 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-config-data-custom\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 
18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423284 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-config-data\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423305 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-combined-ca-bundle\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423357 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptrk\" (UniqueName: \"kubernetes.io/projected/5876eeea-8bcc-4e40-8775-a9c183f68bed-kube-api-access-bptrk\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.423398 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-config-data-custom\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.429218 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-config-data-custom\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" 
Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.430138 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-config-data\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.430710 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5876eeea-8bcc-4e40-8775-a9c183f68bed-combined-ca-bundle\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.443776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptrk\" (UniqueName: \"kubernetes.io/projected/5876eeea-8bcc-4e40-8775-a9c183f68bed-kube-api-access-bptrk\") pod \"heat-engine-79569bdc57-vlxb2\" (UID: \"5876eeea-8bcc-4e40-8775-a9c183f68bed\") " pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.479508 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525035 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-config-data-custom\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbp5v\" (UniqueName: \"kubernetes.io/projected/390d1c49-9e71-4990-9fe9-d98fdd866322-kube-api-access-fbp5v\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525180 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-combined-ca-bundle\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525218 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbznj\" (UniqueName: \"kubernetes.io/projected/9cd166a7-f46c-4049-a692-3e0d3ba12606-kube-api-access-jbznj\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-config-data\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " 
pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525329 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-config-data-custom\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.525370 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-config-data\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.526142 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-combined-ca-bundle\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.531406 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-config-data-custom\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.532329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-combined-ca-bundle\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc 
kubenswrapper[4921]: I0318 13:53:51.533971 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-config-data-custom\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.535371 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-config-data\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.536022 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390d1c49-9e71-4990-9fe9-d98fdd866322-combined-ca-bundle\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.545374 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd166a7-f46c-4049-a692-3e0d3ba12606-config-data\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.549874 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbp5v\" (UniqueName: \"kubernetes.io/projected/390d1c49-9e71-4990-9fe9-d98fdd866322-kube-api-access-fbp5v\") pod \"heat-api-58c44c44b9-4hccn\" (UID: \"390d1c49-9e71-4990-9fe9-d98fdd866322\") " pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.551006 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jbznj\" (UniqueName: \"kubernetes.io/projected/9cd166a7-f46c-4049-a692-3e0d3ba12606-kube-api-access-jbznj\") pod \"heat-cfnapi-57db5945bb-wkc7x\" (UID: \"9cd166a7-f46c-4049-a692-3e0d3ba12606\") " pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.582831 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.670574 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:51 crc kubenswrapper[4921]: I0318 13:53:51.985866 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79569bdc57-vlxb2"] Mar 18 13:53:52 crc kubenswrapper[4921]: I0318 13:53:52.072641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79569bdc57-vlxb2" event={"ID":"5876eeea-8bcc-4e40-8775-a9c183f68bed","Type":"ContainerStarted","Data":"ac67d95c0f841fb55820ff93156e67ad891fc16a8f4a44e19da45d1d4976019d"} Mar 18 13:53:52 crc kubenswrapper[4921]: I0318 13:53:52.177024 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57db5945bb-wkc7x"] Mar 18 13:53:52 crc kubenswrapper[4921]: I0318 13:53:52.244482 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58c44c44b9-4hccn"] Mar 18 13:53:52 crc kubenswrapper[4921]: W0318 13:53:52.247596 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod390d1c49_9e71_4990_9fe9_d98fdd866322.slice/crio-54b181aa3f5b85ac9b229660ee0ba9bf7452947997cfdf91398284dd62f77bc8 WatchSource:0}: Error finding container 54b181aa3f5b85ac9b229660ee0ba9bf7452947997cfdf91398284dd62f77bc8: Status 404 returned error can't find the container with id 54b181aa3f5b85ac9b229660ee0ba9bf7452947997cfdf91398284dd62f77bc8 Mar 18 13:53:53 crc 
kubenswrapper[4921]: I0318 13:53:53.086099 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58c44c44b9-4hccn" event={"ID":"390d1c49-9e71-4990-9fe9-d98fdd866322","Type":"ContainerStarted","Data":"54b181aa3f5b85ac9b229660ee0ba9bf7452947997cfdf91398284dd62f77bc8"} Mar 18 13:53:53 crc kubenswrapper[4921]: I0318 13:53:53.088397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79569bdc57-vlxb2" event={"ID":"5876eeea-8bcc-4e40-8775-a9c183f68bed","Type":"ContainerStarted","Data":"b6a51fb3d001d317729633774491580781b748b477e4eed7f0e1c06dc5575b50"} Mar 18 13:53:53 crc kubenswrapper[4921]: I0318 13:53:53.089872 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:53:53 crc kubenswrapper[4921]: I0318 13:53:53.094764 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" event={"ID":"9cd166a7-f46c-4049-a692-3e0d3ba12606","Type":"ContainerStarted","Data":"a3a8de6ed355edb017b184d8e74cbf306dbf2194154bdff1153653f508d67d79"} Mar 18 13:53:53 crc kubenswrapper[4921]: I0318 13:53:53.109721 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-79569bdc57-vlxb2" podStartSLOduration=2.109697209 podStartE2EDuration="2.109697209s" podCreationTimestamp="2026-03-18 13:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:53:53.107593969 +0000 UTC m=+6252.657514608" watchObservedRunningTime="2026-03-18 13:53:53.109697209 +0000 UTC m=+6252.659617848" Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.048344 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-053c-account-create-update-hg5sk"] Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.061541 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-68l5k"] Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.070962 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-053c-account-create-update-hg5sk"] Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.079655 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-68l5k"] Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.113795 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" event={"ID":"9cd166a7-f46c-4049-a692-3e0d3ba12606","Type":"ContainerStarted","Data":"de4fb2884436fa772f293a36a51c0c9b19c2538f2640f170cf29e6ab1a6f08fe"} Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.115275 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.122271 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58c44c44b9-4hccn" event={"ID":"390d1c49-9e71-4990-9fe9-d98fdd866322","Type":"ContainerStarted","Data":"efb6a073575bf20b9ca752a8b372ea37861bc64b7f4d00ee425eb1ba813c8407"} Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.122422 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.136628 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" podStartSLOduration=1.57102904 podStartE2EDuration="4.136611891s" podCreationTimestamp="2026-03-18 13:53:51 +0000 UTC" firstStartedPulling="2026-03-18 13:53:52.18381516 +0000 UTC m=+6251.733735799" lastFinishedPulling="2026-03-18 13:53:54.749398021 +0000 UTC m=+6254.299318650" observedRunningTime="2026-03-18 13:53:55.133764579 +0000 UTC m=+6254.683685228" watchObservedRunningTime="2026-03-18 13:53:55.136611891 +0000 UTC m=+6254.686532530" Mar 18 
13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.160399 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-58c44c44b9-4hccn" podStartSLOduration=1.6591397350000001 podStartE2EDuration="4.160380032s" podCreationTimestamp="2026-03-18 13:53:51 +0000 UTC" firstStartedPulling="2026-03-18 13:53:52.249932465 +0000 UTC m=+6251.799853104" lastFinishedPulling="2026-03-18 13:53:54.751172762 +0000 UTC m=+6254.301093401" observedRunningTime="2026-03-18 13:53:55.150726175 +0000 UTC m=+6254.700646814" watchObservedRunningTime="2026-03-18 13:53:55.160380032 +0000 UTC m=+6254.710300671" Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.231863 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c9e083a-07ce-4f69-87aa-0fe7589ddeff" path="/var/lib/kubelet/pods/1c9e083a-07ce-4f69-87aa-0fe7589ddeff/volumes" Mar 18 13:53:55 crc kubenswrapper[4921]: I0318 13:53:55.232541 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4754cb8-9b80-4f83-a7eb-16895f1b3dee" path="/var/lib/kubelet/pods/a4754cb8-9b80-4f83-a7eb-16895f1b3dee/volumes" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.145882 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564034-lbk74"] Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.147665 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.151325 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.151545 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.151775 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.153996 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-lbk74"] Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.283442 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.327134 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhnb\" (UniqueName: \"kubernetes.io/projected/fdb493be-0607-4152-ac1d-08fc9ff76476-kube-api-access-6zhnb\") pod \"auto-csr-approver-29564034-lbk74\" (UID: \"fdb493be-0607-4152-ac1d-08fc9ff76476\") " pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.429520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zhnb\" (UniqueName: \"kubernetes.io/projected/fdb493be-0607-4152-ac1d-08fc9ff76476-kube-api-access-6zhnb\") pod \"auto-csr-approver-29564034-lbk74\" (UID: \"fdb493be-0607-4152-ac1d-08fc9ff76476\") " pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.446055 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zhnb\" (UniqueName: 
\"kubernetes.io/projected/fdb493be-0607-4152-ac1d-08fc9ff76476-kube-api-access-6zhnb\") pod \"auto-csr-approver-29564034-lbk74\" (UID: \"fdb493be-0607-4152-ac1d-08fc9ff76476\") " pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.477411 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:00 crc kubenswrapper[4921]: I0318 13:54:00.961236 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-lbk74"] Mar 18 13:54:01 crc kubenswrapper[4921]: I0318 13:54:01.043975 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zc4tr"] Mar 18 13:54:01 crc kubenswrapper[4921]: I0318 13:54:01.057136 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zc4tr"] Mar 18 13:54:01 crc kubenswrapper[4921]: I0318 13:54:01.199136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564034-lbk74" event={"ID":"fdb493be-0607-4152-ac1d-08fc9ff76476","Type":"ContainerStarted","Data":"b8ae604de963571657647c92bf45c8aa270cf3a67e79f1384e84e7250d131266"} Mar 18 13:54:01 crc kubenswrapper[4921]: I0318 13:54:01.220853 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6f701e-7f40-4029-b66b-c840042d5058" path="/var/lib/kubelet/pods/6b6f701e-7f40-4029-b66b-c840042d5058/volumes" Mar 18 13:54:02 crc kubenswrapper[4921]: I0318 13:54:02.176689 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77cf57cd59-b9zl7" Mar 18 13:54:02 crc kubenswrapper[4921]: I0318 13:54:02.244167 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79659f955c-rqc2t"] Mar 18 13:54:02 crc kubenswrapper[4921]: I0318 13:54:02.244440 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon-log" containerID="cri-o://363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f" gracePeriod=30 Mar 18 13:54:02 crc kubenswrapper[4921]: I0318 13:54:02.245036 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" containerID="cri-o://c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4" gracePeriod=30 Mar 18 13:54:03 crc kubenswrapper[4921]: I0318 13:54:03.142455 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-57db5945bb-wkc7x" Mar 18 13:54:03 crc kubenswrapper[4921]: I0318 13:54:03.223331 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-58c44c44b9-4hccn" Mar 18 13:54:03 crc kubenswrapper[4921]: I0318 13:54:03.231753 4921 generic.go:334] "Generic (PLEG): container finished" podID="fdb493be-0607-4152-ac1d-08fc9ff76476" containerID="fc5d89e6e4e55ee3c69178f99812539c7d8b1efa31c39027bfbb24cc78746eb4" exitCode=0 Mar 18 13:54:03 crc kubenswrapper[4921]: I0318 13:54:03.231806 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564034-lbk74" event={"ID":"fdb493be-0607-4152-ac1d-08fc9ff76476","Type":"ContainerDied","Data":"fc5d89e6e4e55ee3c69178f99812539c7d8b1efa31c39027bfbb24cc78746eb4"} Mar 18 13:54:05 crc kubenswrapper[4921]: I0318 13:54:05.409911 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:05 crc kubenswrapper[4921]: I0318 13:54:05.469974 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zhnb\" (UniqueName: \"kubernetes.io/projected/fdb493be-0607-4152-ac1d-08fc9ff76476-kube-api-access-6zhnb\") pod \"fdb493be-0607-4152-ac1d-08fc9ff76476\" (UID: \"fdb493be-0607-4152-ac1d-08fc9ff76476\") " Mar 18 13:54:05 crc kubenswrapper[4921]: I0318 13:54:05.476242 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb493be-0607-4152-ac1d-08fc9ff76476-kube-api-access-6zhnb" (OuterVolumeSpecName: "kube-api-access-6zhnb") pod "fdb493be-0607-4152-ac1d-08fc9ff76476" (UID: "fdb493be-0607-4152-ac1d-08fc9ff76476"). InnerVolumeSpecName "kube-api-access-6zhnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:54:05 crc kubenswrapper[4921]: I0318 13:54:05.572023 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zhnb\" (UniqueName: \"kubernetes.io/projected/fdb493be-0607-4152-ac1d-08fc9ff76476-kube-api-access-6zhnb\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.006504 4921 generic.go:334] "Generic (PLEG): container finished" podID="22516266-aa62-413f-bf35-5aa0add4af9e" containerID="c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4" exitCode=0 Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.006604 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79659f955c-rqc2t" event={"ID":"22516266-aa62-413f-bf35-5aa0add4af9e","Type":"ContainerDied","Data":"c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4"} Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.009386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564034-lbk74" 
event={"ID":"fdb493be-0607-4152-ac1d-08fc9ff76476","Type":"ContainerDied","Data":"b8ae604de963571657647c92bf45c8aa270cf3a67e79f1384e84e7250d131266"} Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.009415 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564034-lbk74" Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.009429 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8ae604de963571657647c92bf45c8aa270cf3a67e79f1384e84e7250d131266" Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.469488 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-ddjgn"] Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.470876 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.143:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.143:8080: connect: connection refused" Mar 18 13:54:06 crc kubenswrapper[4921]: I0318 13:54:06.478307 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-ddjgn"] Mar 18 13:54:07 crc kubenswrapper[4921]: I0318 13:54:07.222245 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97dab222-fe06-4654-bf9a-49f3790bbebf" path="/var/lib/kubelet/pods/97dab222-fe06-4654-bf9a-49f3790bbebf/volumes" Mar 18 13:54:11 crc kubenswrapper[4921]: I0318 13:54:11.507814 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-79569bdc57-vlxb2" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.252607 4921 scope.go:117] "RemoveContainer" containerID="0718ca0d26de2c05a90e0117cabaebba552eb1e052bd89eb788ca16e756cf116" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.279544 4921 scope.go:117] 
"RemoveContainer" containerID="ef7676a51c2301c5ce6d42d1d4cfa3e00deb06de83ac8c6c1cc3d8cb37fa41f9" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.341080 4921 scope.go:117] "RemoveContainer" containerID="fba308647055a187ea1e18c06688340a871327483b7f6cf316017ba58e32e4eb" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.423033 4921 scope.go:117] "RemoveContainer" containerID="a0530887b7045ec4642bb8ed83eee2562c06233d0bd83e2d7ff375f137eee036" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.445385 4921 scope.go:117] "RemoveContainer" containerID="94df89f5af2eabe34ca7f1487ad99b61eb23822abafe4a0d77816af0aa322849" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.492001 4921 scope.go:117] "RemoveContainer" containerID="1929ca627aecea3020096d032d97e843b923a9738afafa42b444d8167df160cb" Mar 18 13:54:12 crc kubenswrapper[4921]: I0318 13:54:12.548952 4921 scope.go:117] "RemoveContainer" containerID="02aa7b07daca54c57e9e50efeb5e7759d15368ba06efab11df2076e0875971d5" Mar 18 13:54:16 crc kubenswrapper[4921]: I0318 13:54:16.472360 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.143:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.143:8080: connect: connection refused" Mar 18 13:54:24 crc kubenswrapper[4921]: I0318 13:54:24.962862 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr"] Mar 18 13:54:24 crc kubenswrapper[4921]: E0318 13:54:24.963850 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb493be-0607-4152-ac1d-08fc9ff76476" containerName="oc" Mar 18 13:54:24 crc kubenswrapper[4921]: I0318 13:54:24.963866 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb493be-0607-4152-ac1d-08fc9ff76476" containerName="oc" Mar 18 13:54:24 crc 
kubenswrapper[4921]: I0318 13:54:24.964141 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb493be-0607-4152-ac1d-08fc9ff76476" containerName="oc" Mar 18 13:54:24 crc kubenswrapper[4921]: I0318 13:54:24.965506 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:24 crc kubenswrapper[4921]: I0318 13:54:24.969894 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 13:54:24 crc kubenswrapper[4921]: I0318 13:54:24.977233 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr"] Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.094062 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.094196 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmkk\" (UniqueName: \"kubernetes.io/projected/f14ebc50-775b-49df-85a7-f3e9c736b149-kube-api-access-mxmkk\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.094278 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.195537 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.195713 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.195757 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmkk\" (UniqueName: \"kubernetes.io/projected/f14ebc50-775b-49df-85a7-f3e9c736b149-kube-api-access-mxmkk\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.196560 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: 
\"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.196570 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.222284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmkk\" (UniqueName: \"kubernetes.io/projected/f14ebc50-775b-49df-85a7-f3e9c736b149-kube-api-access-mxmkk\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.291623 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:25 crc kubenswrapper[4921]: I0318 13:54:25.869100 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr"] Mar 18 13:54:26 crc kubenswrapper[4921]: I0318 13:54:26.210588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" event={"ID":"f14ebc50-775b-49df-85a7-f3e9c736b149","Type":"ContainerStarted","Data":"c6a4ad93de7b97f1600d957d8af161193b434e8b4970b4c418acc98df6851871"} Mar 18 13:54:26 crc kubenswrapper[4921]: I0318 13:54:26.211090 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" event={"ID":"f14ebc50-775b-49df-85a7-f3e9c736b149","Type":"ContainerStarted","Data":"b2cb4457e848b80eb8df1f0d5aece0413aa9da5f317cc116f186de2f220060f4"} Mar 18 13:54:26 crc kubenswrapper[4921]: I0318 13:54:26.471553 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79659f955c-rqc2t" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.143:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.143:8080: connect: connection refused" Mar 18 13:54:26 crc kubenswrapper[4921]: I0318 13:54:26.471667 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:54:27 crc kubenswrapper[4921]: I0318 13:54:27.222759 4921 generic.go:334] "Generic (PLEG): container finished" podID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerID="c6a4ad93de7b97f1600d957d8af161193b434e8b4970b4c418acc98df6851871" exitCode=0 Mar 18 13:54:27 crc kubenswrapper[4921]: I0318 13:54:27.222856 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" event={"ID":"f14ebc50-775b-49df-85a7-f3e9c736b149","Type":"ContainerDied","Data":"c6a4ad93de7b97f1600d957d8af161193b434e8b4970b4c418acc98df6851871"} Mar 18 13:54:29 crc kubenswrapper[4921]: I0318 13:54:29.240297 4921 generic.go:334] "Generic (PLEG): container finished" podID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerID="7946c3bb60572a7df6f9ce7618e101a542bf59df05f57585df1be55d1089a482" exitCode=0 Mar 18 13:54:29 crc kubenswrapper[4921]: I0318 13:54:29.240372 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" event={"ID":"f14ebc50-775b-49df-85a7-f3e9c736b149","Type":"ContainerDied","Data":"7946c3bb60572a7df6f9ce7618e101a542bf59df05f57585df1be55d1089a482"} Mar 18 13:54:30 crc kubenswrapper[4921]: I0318 13:54:30.252908 4921 generic.go:334] "Generic (PLEG): container finished" podID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerID="35076ca42e3838895bdbd17a5f78b50b7476e1d8d526f9208e2956d95aeed27d" exitCode=0 Mar 18 13:54:30 crc kubenswrapper[4921]: I0318 13:54:30.253164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" event={"ID":"f14ebc50-775b-49df-85a7-f3e9c736b149","Type":"ContainerDied","Data":"35076ca42e3838895bdbd17a5f78b50b7476e1d8d526f9208e2956d95aeed27d"} Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.645504 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.836258 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-bundle\") pod \"f14ebc50-775b-49df-85a7-f3e9c736b149\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.836409 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxmkk\" (UniqueName: \"kubernetes.io/projected/f14ebc50-775b-49df-85a7-f3e9c736b149-kube-api-access-mxmkk\") pod \"f14ebc50-775b-49df-85a7-f3e9c736b149\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.836487 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-util\") pod \"f14ebc50-775b-49df-85a7-f3e9c736b149\" (UID: \"f14ebc50-775b-49df-85a7-f3e9c736b149\") " Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.838336 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-bundle" (OuterVolumeSpecName: "bundle") pod "f14ebc50-775b-49df-85a7-f3e9c736b149" (UID: "f14ebc50-775b-49df-85a7-f3e9c736b149"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.847008 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-util" (OuterVolumeSpecName: "util") pod "f14ebc50-775b-49df-85a7-f3e9c736b149" (UID: "f14ebc50-775b-49df-85a7-f3e9c736b149"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.850939 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f14ebc50-775b-49df-85a7-f3e9c736b149-kube-api-access-mxmkk" (OuterVolumeSpecName: "kube-api-access-mxmkk") pod "f14ebc50-775b-49df-85a7-f3e9c736b149" (UID: "f14ebc50-775b-49df-85a7-f3e9c736b149"). InnerVolumeSpecName "kube-api-access-mxmkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.939283 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.939313 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxmkk\" (UniqueName: \"kubernetes.io/projected/f14ebc50-775b-49df-85a7-f3e9c736b149-kube-api-access-mxmkk\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:31 crc kubenswrapper[4921]: I0318 13:54:31.939324 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f14ebc50-775b-49df-85a7-f3e9c736b149-util\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.273523 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" event={"ID":"f14ebc50-775b-49df-85a7-f3e9c736b149","Type":"ContainerDied","Data":"b2cb4457e848b80eb8df1f0d5aece0413aa9da5f317cc116f186de2f220060f4"} Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.273576 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2cb4457e848b80eb8df1f0d5aece0413aa9da5f317cc116f186de2f220060f4" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.273583 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.633149 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.761763 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkqt\" (UniqueName: \"kubernetes.io/projected/22516266-aa62-413f-bf35-5aa0add4af9e-kube-api-access-nqkqt\") pod \"22516266-aa62-413f-bf35-5aa0add4af9e\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.761828 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22516266-aa62-413f-bf35-5aa0add4af9e-horizon-secret-key\") pod \"22516266-aa62-413f-bf35-5aa0add4af9e\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.761958 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22516266-aa62-413f-bf35-5aa0add4af9e-logs\") pod \"22516266-aa62-413f-bf35-5aa0add4af9e\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.762020 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-scripts\") pod \"22516266-aa62-413f-bf35-5aa0add4af9e\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.762088 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-config-data\") pod 
\"22516266-aa62-413f-bf35-5aa0add4af9e\" (UID: \"22516266-aa62-413f-bf35-5aa0add4af9e\") " Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.762882 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22516266-aa62-413f-bf35-5aa0add4af9e-logs" (OuterVolumeSpecName: "logs") pod "22516266-aa62-413f-bf35-5aa0add4af9e" (UID: "22516266-aa62-413f-bf35-5aa0add4af9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.767193 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22516266-aa62-413f-bf35-5aa0add4af9e-kube-api-access-nqkqt" (OuterVolumeSpecName: "kube-api-access-nqkqt") pod "22516266-aa62-413f-bf35-5aa0add4af9e" (UID: "22516266-aa62-413f-bf35-5aa0add4af9e"). InnerVolumeSpecName "kube-api-access-nqkqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.768105 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22516266-aa62-413f-bf35-5aa0add4af9e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "22516266-aa62-413f-bf35-5aa0add4af9e" (UID: "22516266-aa62-413f-bf35-5aa0add4af9e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.786277 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-scripts" (OuterVolumeSpecName: "scripts") pod "22516266-aa62-413f-bf35-5aa0add4af9e" (UID: "22516266-aa62-413f-bf35-5aa0add4af9e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.788778 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-config-data" (OuterVolumeSpecName: "config-data") pod "22516266-aa62-413f-bf35-5aa0add4af9e" (UID: "22516266-aa62-413f-bf35-5aa0add4af9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.864611 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.864657 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkqt\" (UniqueName: \"kubernetes.io/projected/22516266-aa62-413f-bf35-5aa0add4af9e-kube-api-access-nqkqt\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.864669 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/22516266-aa62-413f-bf35-5aa0add4af9e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.864679 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22516266-aa62-413f-bf35-5aa0add4af9e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:32 crc kubenswrapper[4921]: I0318 13:54:32.864687 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22516266-aa62-413f-bf35-5aa0add4af9e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.288368 4921 generic.go:334] "Generic (PLEG): container finished" podID="22516266-aa62-413f-bf35-5aa0add4af9e" 
containerID="363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f" exitCode=137 Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.288448 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79659f955c-rqc2t" event={"ID":"22516266-aa62-413f-bf35-5aa0add4af9e","Type":"ContainerDied","Data":"363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f"} Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.288498 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79659f955c-rqc2t" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.288615 4921 scope.go:117] "RemoveContainer" containerID="c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.288592 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79659f955c-rqc2t" event={"ID":"22516266-aa62-413f-bf35-5aa0add4af9e","Type":"ContainerDied","Data":"57185841c3db8165f6295045170c01c3c8b62dc47ddb4486d9b953321d4876f4"} Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.321720 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79659f955c-rqc2t"] Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.334629 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79659f955c-rqc2t"] Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.465707 4921 scope.go:117] "RemoveContainer" containerID="363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.483067 4921 scope.go:117] "RemoveContainer" containerID="c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4" Mar 18 13:54:33 crc kubenswrapper[4921]: E0318 13:54:33.483588 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4\": container with ID starting with c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4 not found: ID does not exist" containerID="c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.483625 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4"} err="failed to get container status \"c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4\": rpc error: code = NotFound desc = could not find container \"c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4\": container with ID starting with c86c6d3c51f93a5daa2c8a20296b38a5242703c4ee44e36e967b691ead5e8aa4 not found: ID does not exist" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.483647 4921 scope.go:117] "RemoveContainer" containerID="363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f" Mar 18 13:54:33 crc kubenswrapper[4921]: E0318 13:54:33.484012 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f\": container with ID starting with 363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f not found: ID does not exist" containerID="363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f" Mar 18 13:54:33 crc kubenswrapper[4921]: I0318 13:54:33.484038 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f"} err="failed to get container status \"363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f\": rpc error: code = NotFound desc = could not find container \"363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f\": container with ID 
starting with 363821e84c0dedc39ecc25cbdc132001c4fd335ce6f4356800d95bc0b937dd6f not found: ID does not exist" Mar 18 13:54:35 crc kubenswrapper[4921]: I0318 13:54:35.223808 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" path="/var/lib/kubelet/pods/22516266-aa62-413f-bf35-5aa0add4af9e/volumes" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.137059 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7"] Mar 18 13:54:45 crc kubenswrapper[4921]: E0318 13:54:45.138017 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon-log" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138033 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon-log" Mar 18 13:54:45 crc kubenswrapper[4921]: E0318 13:54:45.138042 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="pull" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138048 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="pull" Mar 18 13:54:45 crc kubenswrapper[4921]: E0318 13:54:45.138077 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138088 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" Mar 18 13:54:45 crc kubenswrapper[4921]: E0318 13:54:45.138101 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="util" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138124 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="util" Mar 18 13:54:45 crc kubenswrapper[4921]: E0318 13:54:45.138141 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="extract" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138148 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="extract" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138362 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f14ebc50-775b-49df-85a7-f3e9c736b149" containerName="extract" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138382 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon-log" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.138395 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="22516266-aa62-413f-bf35-5aa0add4af9e" containerName="horizon" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.139210 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.141263 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.141451 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-h2tzn" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.141603 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.149566 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7"] Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.326077 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j7s8\" (UniqueName: \"kubernetes.io/projected/c975b167-f2f3-4d74-9774-8f1734dac995-kube-api-access-6j7s8\") pod \"obo-prometheus-operator-8ff7d675-qhmx7\" (UID: \"c975b167-f2f3-4d74-9774-8f1734dac995\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.427963 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j7s8\" (UniqueName: \"kubernetes.io/projected/c975b167-f2f3-4d74-9774-8f1734dac995-kube-api-access-6j7s8\") pod \"obo-prometheus-operator-8ff7d675-qhmx7\" (UID: \"c975b167-f2f3-4d74-9774-8f1734dac995\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.456868 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j7s8\" (UniqueName: \"kubernetes.io/projected/c975b167-f2f3-4d74-9774-8f1734dac995-kube-api-access-6j7s8\") pod 
\"obo-prometheus-operator-8ff7d675-qhmx7\" (UID: \"c975b167-f2f3-4d74-9774-8f1734dac995\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.521784 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz"] Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.523176 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.525318 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.532728 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ng6w8" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.545783 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz"] Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.631712 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-28rgz\" (UID: \"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.631758 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-5c775898b6-28rgz\" (UID: \"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.666436 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x"] Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.668041 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.680176 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x"] Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.733273 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-28rgz\" (UID: \"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.733332 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-28rgz\" (UID: \"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.755040 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-5c775898b6-28rgz\" (UID: \"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.756988 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-28rgz\" (UID: \"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.757133 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.834947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/545fdd49-888f-466d-a528-ed4c2a57ce42-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x\" (UID: \"545fdd49-888f-466d-a528-ed4c2a57ce42\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.835361 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/545fdd49-888f-466d-a528-ed4c2a57ce42-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x\" (UID: \"545fdd49-888f-466d-a528-ed4c2a57ce42\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.840753 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.936881 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/545fdd49-888f-466d-a528-ed4c2a57ce42-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x\" (UID: \"545fdd49-888f-466d-a528-ed4c2a57ce42\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.936967 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/545fdd49-888f-466d-a528-ed4c2a57ce42-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x\" (UID: \"545fdd49-888f-466d-a528-ed4c2a57ce42\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.942774 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/545fdd49-888f-466d-a528-ed4c2a57ce42-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x\" (UID: \"545fdd49-888f-466d-a528-ed4c2a57ce42\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:45 crc kubenswrapper[4921]: I0318 13:54:45.943428 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/545fdd49-888f-466d-a528-ed4c2a57ce42-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x\" (UID: \"545fdd49-888f-466d-a528-ed4c2a57ce42\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.022804 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.067397 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-bbtjj"] Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.070265 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.075099 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.081874 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-bbtjj"] Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.083926 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-fbt5p" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.243420 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpzg\" (UniqueName: \"kubernetes.io/projected/7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f-kube-api-access-8fpzg\") pod \"observability-operator-6dd7dd855f-bbtjj\" (UID: \"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.243523 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-bbtjj\" (UID: \"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.349223 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpzg\" (UniqueName: \"kubernetes.io/projected/7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f-kube-api-access-8fpzg\") pod \"observability-operator-6dd7dd855f-bbtjj\" (UID: \"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.349314 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-bbtjj\" (UID: \"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.355033 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-bbtjj\" (UID: \"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.381011 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpzg\" (UniqueName: \"kubernetes.io/projected/7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f-kube-api-access-8fpzg\") pod \"observability-operator-6dd7dd855f-bbtjj\" (UID: \"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.412729 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7"] Mar 18 
13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.420319 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:46 crc kubenswrapper[4921]: W0318 13:54:46.425404 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc975b167_f2f3_4d74_9774_8f1734dac995.slice/crio-7465b5c735cff6dfc9f0869120eddc08122c1ffc8da9153f7a77d45db06bbcc2 WatchSource:0}: Error finding container 7465b5c735cff6dfc9f0869120eddc08122c1ffc8da9153f7a77d45db06bbcc2: Status 404 returned error can't find the container with id 7465b5c735cff6dfc9f0869120eddc08122c1ffc8da9153f7a77d45db06bbcc2 Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.450383 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" event={"ID":"c975b167-f2f3-4d74-9774-8f1734dac995","Type":"ContainerStarted","Data":"7465b5c735cff6dfc9f0869120eddc08122c1ffc8da9153f7a77d45db06bbcc2"} Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.507128 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-59cbb5b9bc-jz6kn"] Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.508818 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.514823 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.515492 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-clmgq" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.542606 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-59cbb5b9bc-jz6kn"] Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.627684 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz"] Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.659932 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bjk\" (UniqueName: \"kubernetes.io/projected/0b764acb-5fd6-4b33-b653-597e0b72d927-kube-api-access-48bjk\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.660104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b764acb-5fd6-4b33-b653-597e0b72d927-apiservice-cert\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.660179 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b764acb-5fd6-4b33-b653-597e0b72d927-webhook-cert\") pod 
\"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.660259 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b764acb-5fd6-4b33-b653-597e0b72d927-openshift-service-ca\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.764337 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bjk\" (UniqueName: \"kubernetes.io/projected/0b764acb-5fd6-4b33-b653-597e0b72d927-kube-api-access-48bjk\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.764435 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b764acb-5fd6-4b33-b653-597e0b72d927-apiservice-cert\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.764466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b764acb-5fd6-4b33-b653-597e0b72d927-webhook-cert\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.764512 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b764acb-5fd6-4b33-b653-597e0b72d927-openshift-service-ca\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.765559 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b764acb-5fd6-4b33-b653-597e0b72d927-openshift-service-ca\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.772263 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b764acb-5fd6-4b33-b653-597e0b72d927-apiservice-cert\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.789725 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b764acb-5fd6-4b33-b653-597e0b72d927-webhook-cert\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.805505 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bjk\" (UniqueName: \"kubernetes.io/projected/0b764acb-5fd6-4b33-b653-597e0b72d927-kube-api-access-48bjk\") pod \"perses-operator-59cbb5b9bc-jz6kn\" (UID: \"0b764acb-5fd6-4b33-b653-597e0b72d927\") " pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.861006 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x"] Mar 18 13:54:46 crc kubenswrapper[4921]: I0318 13:54:46.874831 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.085846 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.086177 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.197456 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-bbtjj"] Mar 18 13:54:47 crc kubenswrapper[4921]: W0318 13:54:47.197801 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7de24d45_dec2_4515_b5fe_f3ad4bc7fd8f.slice/crio-438f50028a39b6106887c79c86220d38d0af5b64c921bb244af5f5554c16bd34 WatchSource:0}: Error finding container 438f50028a39b6106887c79c86220d38d0af5b64c921bb244af5f5554c16bd34: Status 404 returned error can't find the container with id 438f50028a39b6106887c79c86220d38d0af5b64c921bb244af5f5554c16bd34 Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.483730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" 
event={"ID":"545fdd49-888f-466d-a528-ed4c2a57ce42","Type":"ContainerStarted","Data":"f039938bb94247018d72816bf7a9c498dde7ed80c4ea883f01410f4dcb5490bf"} Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.496087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-59cbb5b9bc-jz6kn"] Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.503945 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" event={"ID":"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f","Type":"ContainerStarted","Data":"438f50028a39b6106887c79c86220d38d0af5b64c921bb244af5f5554c16bd34"} Mar 18 13:54:47 crc kubenswrapper[4921]: W0318 13:54:47.508123 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b764acb_5fd6_4b33_b653_597e0b72d927.slice/crio-217f2373d0a6eb724f9d9d0a8987d95670520c11c545b21275025fc1da6b09b3 WatchSource:0}: Error finding container 217f2373d0a6eb724f9d9d0a8987d95670520c11c545b21275025fc1da6b09b3: Status 404 returned error can't find the container with id 217f2373d0a6eb724f9d9d0a8987d95670520c11c545b21275025fc1da6b09b3 Mar 18 13:54:47 crc kubenswrapper[4921]: I0318 13:54:47.508771 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" event={"ID":"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8","Type":"ContainerStarted","Data":"9bcc7e3826ef43d7af559482efdfd5094419801464a32b2e15aad878bacbca34"} Mar 18 13:54:48 crc kubenswrapper[4921]: I0318 13:54:48.554719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" event={"ID":"0b764acb-5fd6-4b33-b653-597e0b72d927","Type":"ContainerStarted","Data":"217f2373d0a6eb724f9d9d0a8987d95670520c11c545b21275025fc1da6b09b3"} Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.700413 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" event={"ID":"bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8","Type":"ContainerStarted","Data":"289e73b083dfb1dc48b66e39324bf235642d35f5dbe4620a2cf4e1a4eae5e42d"} Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.734100 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-28rgz" podStartSLOduration=2.276180425 podStartE2EDuration="11.734079956s" podCreationTimestamp="2026-03-18 13:54:45 +0000 UTC" firstStartedPulling="2026-03-18 13:54:46.651238561 +0000 UTC m=+6306.201159200" lastFinishedPulling="2026-03-18 13:54:56.109138092 +0000 UTC m=+6315.659058731" observedRunningTime="2026-03-18 13:54:56.729067232 +0000 UTC m=+6316.278987861" watchObservedRunningTime="2026-03-18 13:54:56.734079956 +0000 UTC m=+6316.284000595" Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.745144 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" event={"ID":"0b764acb-5fd6-4b33-b653-597e0b72d927","Type":"ContainerStarted","Data":"6f10951d6c74ce2ade2e6b66e02be7ce865996a0a9d1c50bd09c98fd5d753707"} Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.746144 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.757798 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" event={"ID":"7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f","Type":"ContainerStarted","Data":"2e129f00a0b94e4a5dc45c39956513cdc9c8c9f0b38498e1672c0cc94214ed3e"} Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.759185 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:56 crc 
kubenswrapper[4921]: I0318 13:54:56.760406 4921 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-bbtjj container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.159:8081/healthz\": dial tcp 10.217.1.159:8081: connect: connection refused" start-of-body= Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.760468 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" podUID="7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.159:8081/healthz\": dial tcp 10.217.1.159:8081: connect: connection refused" Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.786495 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" podStartSLOduration=2.577014689 podStartE2EDuration="11.786473007s" podCreationTimestamp="2026-03-18 13:54:45 +0000 UTC" firstStartedPulling="2026-03-18 13:54:46.899715875 +0000 UTC m=+6306.449636514" lastFinishedPulling="2026-03-18 13:54:56.109174193 +0000 UTC m=+6315.659094832" observedRunningTime="2026-03-18 13:54:56.777556802 +0000 UTC m=+6316.327477441" watchObservedRunningTime="2026-03-18 13:54:56.786473007 +0000 UTC m=+6316.336393646" Mar 18 13:54:56 crc kubenswrapper[4921]: I0318 13:54:56.834073 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" podStartSLOduration=2.236272797 podStartE2EDuration="10.834049091s" podCreationTimestamp="2026-03-18 13:54:46 +0000 UTC" firstStartedPulling="2026-03-18 13:54:47.514620101 +0000 UTC m=+6307.064540740" lastFinishedPulling="2026-03-18 13:54:56.112396395 +0000 UTC m=+6315.662317034" observedRunningTime="2026-03-18 13:54:56.818474945 +0000 UTC m=+6316.368395584" watchObservedRunningTime="2026-03-18 13:54:56.834049091 +0000 
UTC m=+6316.383969740" Mar 18 13:54:57 crc kubenswrapper[4921]: I0318 13:54:57.773399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" event={"ID":"c975b167-f2f3-4d74-9774-8f1734dac995","Type":"ContainerStarted","Data":"2907ac16003206cc42d8848b707e369282a4eab0e9195857cde2bfef94207591"} Mar 18 13:54:57 crc kubenswrapper[4921]: I0318 13:54:57.775367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x" event={"ID":"545fdd49-888f-466d-a528-ed4c2a57ce42","Type":"ContainerStarted","Data":"7a3e3a61500f06c06887b51c6a6f545e8295c0efd64ca4cc8a86a02908887441"} Mar 18 13:54:57 crc kubenswrapper[4921]: I0318 13:54:57.777894 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" Mar 18 13:54:57 crc kubenswrapper[4921]: I0318 13:54:57.795538 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-qhmx7" podStartSLOduration=3.049351088 podStartE2EDuration="12.795516491s" podCreationTimestamp="2026-03-18 13:54:45 +0000 UTC" firstStartedPulling="2026-03-18 13:54:46.435816407 +0000 UTC m=+6305.985737056" lastFinishedPulling="2026-03-18 13:54:56.18198182 +0000 UTC m=+6315.731902459" observedRunningTime="2026-03-18 13:54:57.792649219 +0000 UTC m=+6317.342569868" watchObservedRunningTime="2026-03-18 13:54:57.795516491 +0000 UTC m=+6317.345437130" Mar 18 13:54:57 crc kubenswrapper[4921]: I0318 13:54:57.798703 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-bbtjj" podStartSLOduration=2.7581872670000003 podStartE2EDuration="11.798683952s" podCreationTimestamp="2026-03-18 13:54:46 +0000 UTC" firstStartedPulling="2026-03-18 13:54:47.199987692 +0000 UTC m=+6306.749908331" 
lastFinishedPulling="2026-03-18 13:54:56.240484377 +0000 UTC m=+6315.790405016" observedRunningTime="2026-03-18 13:54:56.878496794 +0000 UTC m=+6316.428417443" watchObservedRunningTime="2026-03-18 13:54:57.798683952 +0000 UTC m=+6317.348604591" Mar 18 13:55:00 crc kubenswrapper[4921]: I0318 13:55:00.040032 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vbtxw"] Mar 18 13:55:00 crc kubenswrapper[4921]: I0318 13:55:00.049148 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6k79j"] Mar 18 13:55:00 crc kubenswrapper[4921]: I0318 13:55:00.058205 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vbtxw"] Mar 18 13:55:00 crc kubenswrapper[4921]: I0318 13:55:00.067269 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6k79j"] Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.038752 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0efa-account-create-update-srvvf"] Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.050252 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0efa-account-create-update-srvvf"] Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.058218 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dp7ds"] Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.066064 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dp7ds"] Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.221920 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f" path="/var/lib/kubelet/pods/1ad6c5a7-b256-4df6-81a2-4cf2dd7fdb0f/volumes" Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.231854 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c218be7-6bd0-4e35-ad4c-4bc4aa635fae" 
path="/var/lib/kubelet/pods/2c218be7-6bd0-4e35-ad4c-4bc4aa635fae/volumes" Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.233007 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9a36da-9efd-4e81-bc24-925a880271da" path="/var/lib/kubelet/pods/6a9a36da-9efd-4e81-bc24-925a880271da/volumes" Mar 18 13:55:01 crc kubenswrapper[4921]: I0318 13:55:01.234736 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d136526d-eb62-4c8e-a26e-250933f5f0f4" path="/var/lib/kubelet/pods/d136526d-eb62-4c8e-a26e-250933f5f0f4/volumes" Mar 18 13:55:02 crc kubenswrapper[4921]: I0318 13:55:02.037509 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9453-account-create-update-d76hx"] Mar 18 13:55:02 crc kubenswrapper[4921]: I0318 13:55:02.047711 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-755e-account-create-update-bssdk"] Mar 18 13:55:02 crc kubenswrapper[4921]: I0318 13:55:02.057910 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9453-account-create-update-d76hx"] Mar 18 13:55:02 crc kubenswrapper[4921]: I0318 13:55:02.066808 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-755e-account-create-update-bssdk"] Mar 18 13:55:03 crc kubenswrapper[4921]: I0318 13:55:03.243536 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095fe9ee-f061-44f7-9c27-3a8700a1be60" path="/var/lib/kubelet/pods/095fe9ee-f061-44f7-9c27-3a8700a1be60/volumes" Mar 18 13:55:03 crc kubenswrapper[4921]: I0318 13:55:03.245099 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75fe9e8d-d964-497d-8956-6ab4b35e8c79" path="/var/lib/kubelet/pods/75fe9e8d-d964-497d-8956-6ab4b35e8c79/volumes" Mar 18 13:55:06 crc kubenswrapper[4921]: I0318 13:55:06.877444 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-59cbb5b9bc-jz6kn" Mar 18 13:55:09 
crc kubenswrapper[4921]: I0318 13:55:09.359679 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.360815 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="58256fdc-88b6-4354-a86c-17dc1aebba44" containerName="openstackclient" containerID="cri-o://da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa" gracePeriod=2 Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.377205 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.429516 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 13:55:09 crc kubenswrapper[4921]: E0318 13:55:09.430006 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58256fdc-88b6-4354-a86c-17dc1aebba44" containerName="openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.430024 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="58256fdc-88b6-4354-a86c-17dc1aebba44" containerName="openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.430230 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="58256fdc-88b6-4354-a86c-17dc1aebba44" containerName="openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.431343 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.463294 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="58256fdc-88b6-4354-a86c-17dc1aebba44" podUID="ac1a56a3-581b-45b6-82f5-88216a7e74fd" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.469743 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.513542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac1a56a3-581b-45b6-82f5-88216a7e74fd-openstack-config-secret\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.513842 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4vk\" (UniqueName: \"kubernetes.io/projected/ac1a56a3-581b-45b6-82f5-88216a7e74fd-kube-api-access-lt4vk\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.513977 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac1a56a3-581b-45b6-82f5-88216a7e74fd-openstack-config\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.590050 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.592258 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.598322 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-g7xfj" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.616965 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ac1a56a3-581b-45b6-82f5-88216a7e74fd-openstack-config\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.617182 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvb5\" (UniqueName: \"kubernetes.io/projected/86806a0f-c6ce-42ec-acde-3919d0c60a65-kube-api-access-4bvb5\") pod \"kube-state-metrics-0\" (UID: \"86806a0f-c6ce-42ec-acde-3919d0c60a65\") " pod="openstack/kube-state-metrics-0" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.617230 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac1a56a3-581b-45b6-82f5-88216a7e74fd-openstack-config-secret\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.617377 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4vk\" (UniqueName: \"kubernetes.io/projected/ac1a56a3-581b-45b6-82f5-88216a7e74fd-kube-api-access-lt4vk\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.618241 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ac1a56a3-581b-45b6-82f5-88216a7e74fd-openstack-config\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.633010 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.638684 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ac1a56a3-581b-45b6-82f5-88216a7e74fd-openstack-config-secret\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.688997 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4vk\" (UniqueName: \"kubernetes.io/projected/ac1a56a3-581b-45b6-82f5-88216a7e74fd-kube-api-access-lt4vk\") pod \"openstackclient\" (UID: \"ac1a56a3-581b-45b6-82f5-88216a7e74fd\") " pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.720144 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bvb5\" (UniqueName: \"kubernetes.io/projected/86806a0f-c6ce-42ec-acde-3919d0c60a65-kube-api-access-4bvb5\") pod \"kube-state-metrics-0\" (UID: \"86806a0f-c6ce-42ec-acde-3919d0c60a65\") " pod="openstack/kube-state-metrics-0" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.752168 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bvb5\" (UniqueName: \"kubernetes.io/projected/86806a0f-c6ce-42ec-acde-3919d0c60a65-kube-api-access-4bvb5\") pod \"kube-state-metrics-0\" (UID: \"86806a0f-c6ce-42ec-acde-3919d0c60a65\") " pod="openstack/kube-state-metrics-0" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.774339 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 13:55:09 crc kubenswrapper[4921]: I0318 13:55:09.910932 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.658799 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.661219 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.662360 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.662494 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.662603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjcz\" (UniqueName: \"kubernetes.io/projected/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-kube-api-access-xmjcz\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.662687 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.662878 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.663050 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.663187 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.667421 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.667640 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-z8khx" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.667791 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"alertmanager-metric-storage-web-config" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.667881 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.667978 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.749726 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776386 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776501 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776622 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776656 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776722 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjcz\" (UniqueName: \"kubernetes.io/projected/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-kube-api-access-xmjcz\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.776769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.785398 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.785897 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.805684 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.808604 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.811888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.816641 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.816706 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.828096 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xmjcz\" (UniqueName: \"kubernetes.io/projected/0f72ab46-f42c-4ec6-8f39-637fb5de6a9a-kube-api-access-xmjcz\") pod \"alertmanager-metric-storage-0\" (UID: \"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:10 crc kubenswrapper[4921]: I0318 13:55:10.941919 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.010258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ac1a56a3-581b-45b6-82f5-88216a7e74fd","Type":"ContainerStarted","Data":"ab8a6a0649d03768c775f4c3f8c95960a2d5f86e07f7f1e7e32391c5e3955abd"} Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.011201 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.027057 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86806a0f-c6ce-42ec-acde-3919d0c60a65","Type":"ContainerStarted","Data":"91d9aa1d7d34794f13881bd4ed082bf03a38a4d71995af2d5083a92911af163d"} Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.123011 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-smltf"] Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.134813 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-smltf"] Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.228790 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fe554c-3d22-45c1-8f9f-b10b3d36ad36" path="/var/lib/kubelet/pods/b0fe554c-3d22-45c1-8f9f-b10b3d36ad36/volumes" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.685270 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:55:11 crc 
kubenswrapper[4921]: I0318 13:55:11.688052 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.691907 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kh9zs" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.692121 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.692240 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.692363 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.692507 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.692605 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.698201 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.708632 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.730721 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775410 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775499 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jh48\" (UniqueName: \"kubernetes.io/projected/9cd7bae7-174c-4207-979c-7883deaa29fb-kube-api-access-8jh48\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775527 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775564 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cd7bae7-174c-4207-979c-7883deaa29fb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775587 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775633 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775659 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775686 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775724 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.775747 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cd7bae7-174c-4207-979c-7883deaa29fb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.823285 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.877769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.878803 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.878984 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jh48\" (UniqueName: \"kubernetes.io/projected/9cd7bae7-174c-4207-979c-7883deaa29fb-kube-api-access-8jh48\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879009 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879050 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cd7bae7-174c-4207-979c-7883deaa29fb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879078 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879149 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879175 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879205 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 
13:55:11.880145 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.879253 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.881237 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9cd7bae7-174c-4207-979c-7883deaa29fb-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.881348 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cd7bae7-174c-4207-979c-7883deaa29fb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.888557 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cd7bae7-174c-4207-979c-7883deaa29fb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 
13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.889207 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.894559 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.894614 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c23627637f5aefa5fc782735562290ffa1543535516754dda005b5664ea8e167/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.896634 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.901922 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cd7bae7-174c-4207-979c-7883deaa29fb-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.906860 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cd7bae7-174c-4207-979c-7883deaa29fb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.909470 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jh48\" (UniqueName: \"kubernetes.io/projected/9cd7bae7-174c-4207-979c-7883deaa29fb-kube-api-access-8jh48\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:11 crc kubenswrapper[4921]: I0318 13:55:11.984188 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27a93a9d-9e8e-4f04-b4c3-d0178bfe78da\") pod \"prometheus-metric-storage-0\" (UID: \"9cd7bae7-174c-4207-979c-7883deaa29fb\") " pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.042342 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.047354 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="58256fdc-88b6-4354-a86c-17dc1aebba44" podUID="ac1a56a3-581b-45b6-82f5-88216a7e74fd" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.051694 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ac1a56a3-581b-45b6-82f5-88216a7e74fd","Type":"ContainerStarted","Data":"59e185d05185882cfcc57c52741d4c1b850dc5a26ed6a7142c421a065b2d92b4"} Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.055713 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86806a0f-c6ce-42ec-acde-3919d0c60a65","Type":"ContainerStarted","Data":"8177f82e36d88ca6a0f66e40fc8e524a39f97935259fb7d25612d92c761d016e"} Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.056512 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.058365 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a","Type":"ContainerStarted","Data":"4eec60ec29c1bac73a6ebf9fafa9950d51a1a2d0734a91d1c3571ef762f6e358"} Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.062768 4921 generic.go:334] "Generic (PLEG): container finished" podID="58256fdc-88b6-4354-a86c-17dc1aebba44" containerID="da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa" exitCode=137 Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.062811 4921 scope.go:117] "RemoveContainer" containerID="da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.062929 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.083633 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.085029 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config-secret\") pod \"58256fdc-88b6-4354-a86c-17dc1aebba44\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.086803 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndpx7\" (UniqueName: \"kubernetes.io/projected/58256fdc-88b6-4354-a86c-17dc1aebba44-kube-api-access-ndpx7\") pod \"58256fdc-88b6-4354-a86c-17dc1aebba44\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.086833 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config\") pod \"58256fdc-88b6-4354-a86c-17dc1aebba44\" (UID: \"58256fdc-88b6-4354-a86c-17dc1aebba44\") " Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.097034 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="58256fdc-88b6-4354-a86c-17dc1aebba44" podUID="ac1a56a3-581b-45b6-82f5-88216a7e74fd" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.103462 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58256fdc-88b6-4354-a86c-17dc1aebba44-kube-api-access-ndpx7" (OuterVolumeSpecName: "kube-api-access-ndpx7") pod "58256fdc-88b6-4354-a86c-17dc1aebba44" (UID: "58256fdc-88b6-4354-a86c-17dc1aebba44"). 
InnerVolumeSpecName "kube-api-access-ndpx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.104141 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.104086952 podStartE2EDuration="3.104086952s" podCreationTimestamp="2026-03-18 13:55:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:55:12.06601374 +0000 UTC m=+6331.615934379" watchObservedRunningTime="2026-03-18 13:55:12.104086952 +0000 UTC m=+6331.654007581" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.108702 4921 scope.go:117] "RemoveContainer" containerID="da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa" Mar 18 13:55:12 crc kubenswrapper[4921]: E0318 13:55:12.109610 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa\": container with ID starting with da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa not found: ID does not exist" containerID="da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.109681 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa"} err="failed to get container status \"da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa\": rpc error: code = NotFound desc = could not find container \"da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa\": container with ID starting with da839d3f6db8639bf73845a97227a6599ee9350b78b073a06e354b3d0f19ffaa not found: ID does not exist" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.112165 4921 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.657949494 podStartE2EDuration="3.112089551s" podCreationTimestamp="2026-03-18 13:55:09 +0000 UTC" firstStartedPulling="2026-03-18 13:55:10.871445489 +0000 UTC m=+6330.421366128" lastFinishedPulling="2026-03-18 13:55:11.325585546 +0000 UTC m=+6330.875506185" observedRunningTime="2026-03-18 13:55:12.091799599 +0000 UTC m=+6331.641720238" watchObservedRunningTime="2026-03-18 13:55:12.112089551 +0000 UTC m=+6331.662010190" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.118368 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "58256fdc-88b6-4354-a86c-17dc1aebba44" (UID: "58256fdc-88b6-4354-a86c-17dc1aebba44"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.190295 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndpx7\" (UniqueName: \"kubernetes.io/projected/58256fdc-88b6-4354-a86c-17dc1aebba44-kube-api-access-ndpx7\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.190325 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.201225 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "58256fdc-88b6-4354-a86c-17dc1aebba44" (UID: "58256fdc-88b6-4354-a86c-17dc1aebba44"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.292287 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58256fdc-88b6-4354-a86c-17dc1aebba44-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.394536 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="58256fdc-88b6-4354-a86c-17dc1aebba44" podUID="ac1a56a3-581b-45b6-82f5-88216a7e74fd" Mar 18 13:55:12 crc kubenswrapper[4921]: W0318 13:55:12.666254 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cd7bae7_174c_4207_979c_7883deaa29fb.slice/crio-fd6d6f3cf4f18f1aed50aa2d3a5b78029795e76aa712e944ab760acd70206685 WatchSource:0}: Error finding container fd6d6f3cf4f18f1aed50aa2d3a5b78029795e76aa712e944ab760acd70206685: Status 404 returned error can't find the container with id fd6d6f3cf4f18f1aed50aa2d3a5b78029795e76aa712e944ab760acd70206685 Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.667547 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.739990 4921 scope.go:117] "RemoveContainer" containerID="7eb96d68e7e625f94612a5a361d373fd32b7ed7dc9e46b2609b473e18c645388" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.783225 4921 scope.go:117] "RemoveContainer" containerID="4175a6147f35c76e976564f86ceadd9ee21103301f94f37f9ddc12e8dbc54b2f" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.825286 4921 scope.go:117] "RemoveContainer" containerID="b6f99e564045e1757bd4ac3e9496ecc2001dddd60ef44395a4a8474715cf4c31" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.887476 4921 scope.go:117] "RemoveContainer" 
containerID="60b7b6e0491b94b05ff7418c3c7541c3a304585fce61af6c2aac8f77125ac3f9" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.926996 4921 scope.go:117] "RemoveContainer" containerID="deb5d22713ef5c5c4763ce0b3d2b60a1b681ca478566596c6af362549c7a984b" Mar 18 13:55:12 crc kubenswrapper[4921]: I0318 13:55:12.971621 4921 scope.go:117] "RemoveContainer" containerID="32676dd38ff52b4419048836061dfe80ae96dd1f80849f1da8dfed633da0fafc" Mar 18 13:55:13 crc kubenswrapper[4921]: I0318 13:55:13.000839 4921 scope.go:117] "RemoveContainer" containerID="f50a155c80d235574493bae686b8564dfc263bb9683eacb2dccaa598e220a47d" Mar 18 13:55:13 crc kubenswrapper[4921]: I0318 13:55:13.091913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cd7bae7-174c-4207-979c-7883deaa29fb","Type":"ContainerStarted","Data":"fd6d6f3cf4f18f1aed50aa2d3a5b78029795e76aa712e944ab760acd70206685"} Mar 18 13:55:13 crc kubenswrapper[4921]: I0318 13:55:13.220984 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58256fdc-88b6-4354-a86c-17dc1aebba44" path="/var/lib/kubelet/pods/58256fdc-88b6-4354-a86c-17dc1aebba44/volumes" Mar 18 13:55:17 crc kubenswrapper[4921]: I0318 13:55:17.080788 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:55:17 crc kubenswrapper[4921]: I0318 13:55:17.081492 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:55:19 crc kubenswrapper[4921]: I0318 13:55:19.171142 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a","Type":"ContainerStarted","Data":"174457db9c699ef94343e612aee77d5b7f6007036bafb609df3938ba4f8385cc"} Mar 18 13:55:19 crc kubenswrapper[4921]: I0318 13:55:19.174066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cd7bae7-174c-4207-979c-7883deaa29fb","Type":"ContainerStarted","Data":"b5ae44297ccca49eb243cfca63dbb9828e1c4d43ff3f547549bb17030a207bc1"} Mar 18 13:55:19 crc kubenswrapper[4921]: I0318 13:55:19.916558 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 13:55:24 crc kubenswrapper[4921]: I0318 13:55:24.229267 4921 generic.go:334] "Generic (PLEG): container finished" podID="0f72ab46-f42c-4ec6-8f39-637fb5de6a9a" containerID="174457db9c699ef94343e612aee77d5b7f6007036bafb609df3938ba4f8385cc" exitCode=0 Mar 18 13:55:24 crc kubenswrapper[4921]: I0318 13:55:24.229491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a","Type":"ContainerDied","Data":"174457db9c699ef94343e612aee77d5b7f6007036bafb609df3938ba4f8385cc"} Mar 18 13:55:25 crc kubenswrapper[4921]: I0318 13:55:25.042189 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkfpg"] Mar 18 13:55:25 crc kubenswrapper[4921]: I0318 13:55:25.050899 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkfpg"] Mar 18 13:55:25 crc kubenswrapper[4921]: I0318 13:55:25.221049 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561064ed-df57-419d-8d92-3837da58564b" path="/var/lib/kubelet/pods/561064ed-df57-419d-8d92-3837da58564b/volumes" Mar 18 13:55:25 crc kubenswrapper[4921]: I0318 13:55:25.241233 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="9cd7bae7-174c-4207-979c-7883deaa29fb" containerID="b5ae44297ccca49eb243cfca63dbb9828e1c4d43ff3f547549bb17030a207bc1" exitCode=0 Mar 18 13:55:25 crc kubenswrapper[4921]: I0318 13:55:25.241318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cd7bae7-174c-4207-979c-7883deaa29fb","Type":"ContainerDied","Data":"b5ae44297ccca49eb243cfca63dbb9828e1c4d43ff3f547549bb17030a207bc1"} Mar 18 13:55:26 crc kubenswrapper[4921]: I0318 13:55:26.034982 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xgv99"] Mar 18 13:55:26 crc kubenswrapper[4921]: I0318 13:55:26.045338 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xgv99"] Mar 18 13:55:27 crc kubenswrapper[4921]: I0318 13:55:27.223137 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83fb77e2-32ca-40d1-a597-47c8ebe2a41a" path="/var/lib/kubelet/pods/83fb77e2-32ca-40d1-a597-47c8ebe2a41a/volumes" Mar 18 13:55:28 crc kubenswrapper[4921]: I0318 13:55:28.284830 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a","Type":"ContainerStarted","Data":"5be0b903a3107874cf68a8269956211929c20f2e15d51249224a9df2e56f79b7"} Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.475988 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b4snj"] Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.478772 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.491072 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4snj"] Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.599296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vs5\" (UniqueName: \"kubernetes.io/projected/41f644b2-cfb4-41de-bbbd-5d48032f088b-kube-api-access-44vs5\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.599357 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-catalog-content\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.599429 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-utilities\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.701306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vs5\" (UniqueName: \"kubernetes.io/projected/41f644b2-cfb4-41de-bbbd-5d48032f088b-kube-api-access-44vs5\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.701372 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-catalog-content\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.701427 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-utilities\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.702079 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-utilities\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.707007 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-catalog-content\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.725340 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vs5\" (UniqueName: \"kubernetes.io/projected/41f644b2-cfb4-41de-bbbd-5d48032f088b-kube-api-access-44vs5\") pod \"certified-operators-b4snj\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:30 crc kubenswrapper[4921]: I0318 13:55:30.802393 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:32 crc kubenswrapper[4921]: I0318 13:55:32.330880 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0f72ab46-f42c-4ec6-8f39-637fb5de6a9a","Type":"ContainerStarted","Data":"2da5871699ede12726bbd1c961c1ac66d36e7ecab4e798f1a050b0d1dc1c0345"} Mar 18 13:55:32 crc kubenswrapper[4921]: I0318 13:55:32.331481 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:32 crc kubenswrapper[4921]: I0318 13:55:32.334861 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 18 13:55:32 crc kubenswrapper[4921]: I0318 13:55:32.356410 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.861871202 podStartE2EDuration="22.356389578s" podCreationTimestamp="2026-03-18 13:55:10 +0000 UTC" firstStartedPulling="2026-03-18 13:55:11.834643698 +0000 UTC m=+6331.384564337" lastFinishedPulling="2026-03-18 13:55:27.329162074 +0000 UTC m=+6346.879082713" observedRunningTime="2026-03-18 13:55:32.350265683 +0000 UTC m=+6351.900186322" watchObservedRunningTime="2026-03-18 13:55:32.356389578 +0000 UTC m=+6351.906310217" Mar 18 13:55:33 crc kubenswrapper[4921]: I0318 13:55:33.075205 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b4snj"] Mar 18 13:55:33 crc kubenswrapper[4921]: I0318 13:55:33.342086 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cd7bae7-174c-4207-979c-7883deaa29fb","Type":"ContainerStarted","Data":"7938ce0c22bb8e362dcae2ef4b6924cd4827cda385ba4f08bc8dfb388f422a4e"} Mar 18 13:55:33 crc kubenswrapper[4921]: I0318 13:55:33.344047 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerID="faec84f356d8fc06ef42e47becd11046ef8280edac0208e7f9bcef1e2f5784c3" exitCode=0 Mar 18 13:55:33 crc kubenswrapper[4921]: I0318 13:55:33.344224 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerDied","Data":"faec84f356d8fc06ef42e47becd11046ef8280edac0208e7f9bcef1e2f5784c3"} Mar 18 13:55:33 crc kubenswrapper[4921]: I0318 13:55:33.344295 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerStarted","Data":"e19ff3e4e60276a09397db09a93cdd1b187e5fad8af83d03ad56c0d338612faf"} Mar 18 13:55:34 crc kubenswrapper[4921]: I0318 13:55:34.356301 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerStarted","Data":"f01408b04bbfa1005440a3d31191ce0a848c63f8bc43f8558fba54883e1e4c57"} Mar 18 13:55:36 crc kubenswrapper[4921]: I0318 13:55:36.378035 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cd7bae7-174c-4207-979c-7883deaa29fb","Type":"ContainerStarted","Data":"48000d07465dbf3df35f62359cf3ab648bf89b5a2592e4fa4732274f8e2c2b85"} Mar 18 13:55:36 crc kubenswrapper[4921]: I0318 13:55:36.381073 4921 generic.go:334] "Generic (PLEG): container finished" podID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerID="f01408b04bbfa1005440a3d31191ce0a848c63f8bc43f8558fba54883e1e4c57" exitCode=0 Mar 18 13:55:36 crc kubenswrapper[4921]: I0318 13:55:36.381131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerDied","Data":"f01408b04bbfa1005440a3d31191ce0a848c63f8bc43f8558fba54883e1e4c57"} Mar 
18 13:55:37 crc kubenswrapper[4921]: I0318 13:55:37.394747 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerStarted","Data":"1ef7d05d45194bb1bd190e8c580b7d6b6fb37e058f1a30f6be6b246eedb02c67"} Mar 18 13:55:37 crc kubenswrapper[4921]: I0318 13:55:37.419724 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b4snj" podStartSLOduration=3.855760656 podStartE2EDuration="7.419703686s" podCreationTimestamp="2026-03-18 13:55:30 +0000 UTC" firstStartedPulling="2026-03-18 13:55:33.346268822 +0000 UTC m=+6352.896189501" lastFinishedPulling="2026-03-18 13:55:36.910211892 +0000 UTC m=+6356.460132531" observedRunningTime="2026-03-18 13:55:37.410102781 +0000 UTC m=+6356.960023420" watchObservedRunningTime="2026-03-18 13:55:37.419703686 +0000 UTC m=+6356.969624325" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.414267 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ldkf5"] Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.416938 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.429097 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ldkf5"] Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.601920 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-utilities\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.601983 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bl2m\" (UniqueName: \"kubernetes.io/projected/c5a6b043-388f-4a9e-9a21-ec65235abf20-kube-api-access-7bl2m\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.602352 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-catalog-content\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.705476 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-utilities\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.705866 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7bl2m\" (UniqueName: \"kubernetes.io/projected/c5a6b043-388f-4a9e-9a21-ec65235abf20-kube-api-access-7bl2m\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.705923 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-catalog-content\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.706502 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-catalog-content\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.706583 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-utilities\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.740211 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bl2m\" (UniqueName: \"kubernetes.io/projected/c5a6b043-388f-4a9e-9a21-ec65235abf20-kube-api-access-7bl2m\") pod \"community-operators-ldkf5\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:39 crc kubenswrapper[4921]: I0318 13:55:39.747884 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.059073 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xplfh"] Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.076436 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xplfh"] Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.321130 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ldkf5"] Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.453128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerStarted","Data":"a066688d7a3207d5062633c5239e9a133747c4e4629ef0697a16db3db09d2073"} Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.458868 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cd7bae7-174c-4207-979c-7883deaa29fb","Type":"ContainerStarted","Data":"fc13c5854167fb964c1b9e7d357e4638c88f72181172da8cdd9d7d247c68d900"} Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.485761 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.460710958 podStartE2EDuration="30.485742543s" podCreationTimestamp="2026-03-18 13:55:10 +0000 UTC" firstStartedPulling="2026-03-18 13:55:12.66917426 +0000 UTC m=+6332.219094899" lastFinishedPulling="2026-03-18 13:55:39.694205845 +0000 UTC m=+6359.244126484" observedRunningTime="2026-03-18 13:55:40.482596443 +0000 UTC m=+6360.032517082" watchObservedRunningTime="2026-03-18 13:55:40.485742543 +0000 UTC m=+6360.035663192" Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.802699 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:40 crc kubenswrapper[4921]: I0318 13:55:40.802755 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:41 crc kubenswrapper[4921]: I0318 13:55:41.225184 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645f51e9-2ec5-4a14-80fd-0cdab30d1675" path="/var/lib/kubelet/pods/645f51e9-2ec5-4a14-80fd-0cdab30d1675/volumes" Mar 18 13:55:41 crc kubenswrapper[4921]: I0318 13:55:41.471242 4921 generic.go:334] "Generic (PLEG): container finished" podID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerID="10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9" exitCode=0 Mar 18 13:55:41 crc kubenswrapper[4921]: I0318 13:55:41.471294 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerDied","Data":"10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9"} Mar 18 13:55:41 crc kubenswrapper[4921]: I0318 13:55:41.850269 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b4snj" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="registry-server" probeResult="failure" output=< Mar 18 13:55:41 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 13:55:41 crc kubenswrapper[4921]: > Mar 18 13:55:42 crc kubenswrapper[4921]: I0318 13:55:42.086499 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:42 crc kubenswrapper[4921]: I0318 13:55:42.086635 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:42 crc kubenswrapper[4921]: I0318 13:55:42.092387 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:42 crc kubenswrapper[4921]: I0318 13:55:42.482502 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerStarted","Data":"0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa"} Mar 18 13:55:42 crc kubenswrapper[4921]: I0318 13:55:42.484096 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.589995 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.617404 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.621446 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.621967 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.652326 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.709678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-log-httpd\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.709744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-run-httpd\") pod \"ceilometer-0\" (UID: 
\"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.709892 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.709966 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.710008 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-config-data\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.710068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsv8j\" (UniqueName: \"kubernetes.io/projected/cea7f817-98f3-42d6-a9ee-73183d043599-kube-api-access-jsv8j\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.710149 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-scripts\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 
13:55:43.811591 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsv8j\" (UniqueName: \"kubernetes.io/projected/cea7f817-98f3-42d6-a9ee-73183d043599-kube-api-access-jsv8j\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.811655 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-scripts\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.811793 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-log-httpd\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.811855 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-run-httpd\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.811925 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.811964 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.811992 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-config-data\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.812564 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-run-httpd\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.812630 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-log-httpd\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.821256 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.822190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-scripts\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.823473 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.825040 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-config-data\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.831029 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsv8j\" (UniqueName: \"kubernetes.io/projected/cea7f817-98f3-42d6-a9ee-73183d043599-kube-api-access-jsv8j\") pod \"ceilometer-0\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " pod="openstack/ceilometer-0" Mar 18 13:55:43 crc kubenswrapper[4921]: I0318 13:55:43.951468 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:55:44 crc kubenswrapper[4921]: I0318 13:55:44.507363 4921 generic.go:334] "Generic (PLEG): container finished" podID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerID="0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa" exitCode=0 Mar 18 13:55:44 crc kubenswrapper[4921]: I0318 13:55:44.507431 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerDied","Data":"0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa"} Mar 18 13:55:44 crc kubenswrapper[4921]: I0318 13:55:44.724507 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:55:44 crc kubenswrapper[4921]: W0318 13:55:44.740434 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea7f817_98f3_42d6_a9ee_73183d043599.slice/crio-ee728694a484c1fd89ffd5c8d1de632690c3d59e4f8ecd4e4b7bdf083f7d6f82 WatchSource:0}: Error finding container ee728694a484c1fd89ffd5c8d1de632690c3d59e4f8ecd4e4b7bdf083f7d6f82: Status 404 returned error can't find the container with id ee728694a484c1fd89ffd5c8d1de632690c3d59e4f8ecd4e4b7bdf083f7d6f82 Mar 18 13:55:45 crc kubenswrapper[4921]: I0318 13:55:45.519809 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerStarted","Data":"6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d"} Mar 18 13:55:45 crc kubenswrapper[4921]: I0318 13:55:45.521572 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerStarted","Data":"acf1ebbb0f74c1f0254b7bce7af00d6a9886f37720583481d140271cea8c4522"} Mar 18 13:55:45 crc kubenswrapper[4921]: I0318 13:55:45.521601 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerStarted","Data":"ee728694a484c1fd89ffd5c8d1de632690c3d59e4f8ecd4e4b7bdf083f7d6f82"} Mar 18 13:55:45 crc kubenswrapper[4921]: I0318 13:55:45.542519 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ldkf5" podStartSLOduration=3.038600985 podStartE2EDuration="6.542501924s" podCreationTimestamp="2026-03-18 13:55:39 +0000 UTC" firstStartedPulling="2026-03-18 13:55:41.47430509 +0000 UTC m=+6361.024225729" lastFinishedPulling="2026-03-18 13:55:44.978206029 +0000 UTC m=+6364.528126668" observedRunningTime="2026-03-18 13:55:45.540329512 +0000 UTC m=+6365.090250151" watchObservedRunningTime="2026-03-18 13:55:45.542501924 +0000 UTC m=+6365.092422563" Mar 18 13:55:46 crc 
kubenswrapper[4921]: I0318 13:55:46.534628 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerStarted","Data":"e9f8ba56ac3a6ecb7cbcf33f0b6f75b254142fa773ec6e6637b19dad339ecf22"} Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.081312 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.081681 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.081733 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.082605 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.082681 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" 
containerID="cri-o://ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" gracePeriod=600 Mar 18 13:55:47 crc kubenswrapper[4921]: E0318 13:55:47.214770 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.560055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerStarted","Data":"92acc74489094357239191794278d92047fbfd8763622cbb29e69f48d8a49bf1"} Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.563162 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" exitCode=0 Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.563198 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"} Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.563287 4921 scope.go:117] "RemoveContainer" containerID="038dd07b701a6099240a4d7ebb53358644804af29a553533cae1d5837f2d8758" Mar 18 13:55:47 crc kubenswrapper[4921]: I0318 13:55:47.564055 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:55:47 crc kubenswrapper[4921]: E0318 13:55:47.564527 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.432438 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w7cwk"] Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.440588 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.457021 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7cwk"] Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.540504 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-utilities\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.540767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8tc\" (UniqueName: \"kubernetes.io/projected/082ed6d1-fe58-4faf-9b82-7c5292191511-kube-api-access-7f8tc\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.540829 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-catalog-content\") pod 
\"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.643157 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8tc\" (UniqueName: \"kubernetes.io/projected/082ed6d1-fe58-4faf-9b82-7c5292191511-kube-api-access-7f8tc\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.643223 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-catalog-content\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.643283 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-utilities\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.644198 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-utilities\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.644443 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-catalog-content\") pod \"redhat-marketplace-w7cwk\" (UID: 
\"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.664939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8tc\" (UniqueName: \"kubernetes.io/projected/082ed6d1-fe58-4faf-9b82-7c5292191511-kube-api-access-7f8tc\") pod \"redhat-marketplace-w7cwk\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.748611 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.748881 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:49 crc kubenswrapper[4921]: I0318 13:55:49.939836 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.548465 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7cwk"] Mar 18 13:55:50 crc kubenswrapper[4921]: W0318 13:55:50.563706 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod082ed6d1_fe58_4faf_9b82_7c5292191511.slice/crio-1cac98a192fb8ed293facc715707a72d2bdc52f79c61cba1abd10b8576668e13 WatchSource:0}: Error finding container 1cac98a192fb8ed293facc715707a72d2bdc52f79c61cba1abd10b8576668e13: Status 404 returned error can't find the container with id 1cac98a192fb8ed293facc715707a72d2bdc52f79c61cba1abd10b8576668e13 Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.594758 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerStarted","Data":"b92a6af7b4f9ecfa2c96a60d6c1362cd557f76a4a811357d282d4bd7cd2e44fa"} Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.594904 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.597378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerStarted","Data":"1cac98a192fb8ed293facc715707a72d2bdc52f79c61cba1abd10b8576668e13"} Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.633070 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.945325341 podStartE2EDuration="7.633048463s" podCreationTimestamp="2026-03-18 13:55:43 +0000 UTC" firstStartedPulling="2026-03-18 13:55:44.743774609 +0000 UTC m=+6364.293695248" lastFinishedPulling="2026-03-18 13:55:49.431497731 +0000 UTC m=+6368.981418370" observedRunningTime="2026-03-18 13:55:50.622783548 +0000 UTC m=+6370.172704197" watchObservedRunningTime="2026-03-18 13:55:50.633048463 +0000 UTC m=+6370.182969092" Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.801370 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ldkf5" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="registry-server" probeResult="failure" output=< Mar 18 13:55:50 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 13:55:50 crc kubenswrapper[4921]: > Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.855301 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:50 crc kubenswrapper[4921]: I0318 13:55:50.917159 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:51 crc kubenswrapper[4921]: I0318 13:55:51.607679 4921 generic.go:334] "Generic (PLEG): container finished" podID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerID="300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35" exitCode=0 Mar 18 13:55:51 crc kubenswrapper[4921]: I0318 13:55:51.607745 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerDied","Data":"300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35"} Mar 18 13:55:51 crc kubenswrapper[4921]: I0318 13:55:51.610078 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:55:52 crc kubenswrapper[4921]: I0318 13:55:52.616916 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerStarted","Data":"bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc"} Mar 18 13:55:53 crc kubenswrapper[4921]: I0318 13:55:53.224707 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b4snj"] Mar 18 13:55:53 crc kubenswrapper[4921]: I0318 13:55:53.225446 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b4snj" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="registry-server" containerID="cri-o://1ef7d05d45194bb1bd190e8c580b7d6b6fb37e058f1a30f6be6b246eedb02c67" gracePeriod=2 Mar 18 13:55:53 crc kubenswrapper[4921]: I0318 13:55:53.630204 4921 generic.go:334] "Generic (PLEG): container finished" podID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerID="1ef7d05d45194bb1bd190e8c580b7d6b6fb37e058f1a30f6be6b246eedb02c67" exitCode=0 Mar 18 13:55:53 crc kubenswrapper[4921]: I0318 13:55:53.630254 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerDied","Data":"1ef7d05d45194bb1bd190e8c580b7d6b6fb37e058f1a30f6be6b246eedb02c67"} Mar 18 13:55:53 crc kubenswrapper[4921]: I0318 13:55:53.905424 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.065836 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-utilities\") pod \"41f644b2-cfb4-41de-bbbd-5d48032f088b\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.065905 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-catalog-content\") pod \"41f644b2-cfb4-41de-bbbd-5d48032f088b\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.066179 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vs5\" (UniqueName: \"kubernetes.io/projected/41f644b2-cfb4-41de-bbbd-5d48032f088b-kube-api-access-44vs5\") pod \"41f644b2-cfb4-41de-bbbd-5d48032f088b\" (UID: \"41f644b2-cfb4-41de-bbbd-5d48032f088b\") " Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.066569 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-utilities" (OuterVolumeSpecName: "utilities") pod "41f644b2-cfb4-41de-bbbd-5d48032f088b" (UID: "41f644b2-cfb4-41de-bbbd-5d48032f088b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.089267 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f644b2-cfb4-41de-bbbd-5d48032f088b-kube-api-access-44vs5" (OuterVolumeSpecName: "kube-api-access-44vs5") pod "41f644b2-cfb4-41de-bbbd-5d48032f088b" (UID: "41f644b2-cfb4-41de-bbbd-5d48032f088b"). InnerVolumeSpecName "kube-api-access-44vs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.116621 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41f644b2-cfb4-41de-bbbd-5d48032f088b" (UID: "41f644b2-cfb4-41de-bbbd-5d48032f088b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.168837 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vs5\" (UniqueName: \"kubernetes.io/projected/41f644b2-cfb4-41de-bbbd-5d48032f088b-kube-api-access-44vs5\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.168874 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.168889 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41f644b2-cfb4-41de-bbbd-5d48032f088b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.664606 4921 generic.go:334] "Generic (PLEG): container finished" podID="082ed6d1-fe58-4faf-9b82-7c5292191511" 
containerID="bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc" exitCode=0 Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.664750 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerDied","Data":"bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc"} Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.670424 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b4snj" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.670437 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b4snj" event={"ID":"41f644b2-cfb4-41de-bbbd-5d48032f088b","Type":"ContainerDied","Data":"e19ff3e4e60276a09397db09a93cdd1b187e5fad8af83d03ad56c0d338612faf"} Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.670490 4921 scope.go:117] "RemoveContainer" containerID="1ef7d05d45194bb1bd190e8c580b7d6b6fb37e058f1a30f6be6b246eedb02c67" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.701232 4921 scope.go:117] "RemoveContainer" containerID="f01408b04bbfa1005440a3d31191ce0a848c63f8bc43f8558fba54883e1e4c57" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.723044 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b4snj"] Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.727637 4921 scope.go:117] "RemoveContainer" containerID="faec84f356d8fc06ef42e47becd11046ef8280edac0208e7f9bcef1e2f5784c3" Mar 18 13:55:54 crc kubenswrapper[4921]: I0318 13:55:54.734783 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b4snj"] Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.223601 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" 
path="/var/lib/kubelet/pods/41f644b2-cfb4-41de-bbbd-5d48032f088b/volumes" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.387195 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-kb4pr"] Mar 18 13:55:55 crc kubenswrapper[4921]: E0318 13:55:55.387831 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="extract-utilities" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.387848 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="extract-utilities" Mar 18 13:55:55 crc kubenswrapper[4921]: E0318 13:55:55.387871 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="extract-content" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.387878 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="extract-content" Mar 18 13:55:55 crc kubenswrapper[4921]: E0318 13:55:55.387889 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="registry-server" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.387895 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="registry-server" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.388101 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f644b2-cfb4-41de-bbbd-5d48032f088b" containerName="registry-server" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.388847 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.402212 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kb4pr"] Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.493719 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvkp\" (UniqueName: \"kubernetes.io/projected/dab18c2d-33f1-49e1-a4cf-f0200d43474d-kube-api-access-qqvkp\") pod \"aodh-db-create-kb4pr\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.493796 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab18c2d-33f1-49e1-a4cf-f0200d43474d-operator-scripts\") pod \"aodh-db-create-kb4pr\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.493842 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-58d5-account-create-update-4zxmv"] Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.495600 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.500163 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.501955 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-58d5-account-create-update-4zxmv"] Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.595831 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvkp\" (UniqueName: \"kubernetes.io/projected/dab18c2d-33f1-49e1-a4cf-f0200d43474d-kube-api-access-qqvkp\") pod \"aodh-db-create-kb4pr\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.595908 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab18c2d-33f1-49e1-a4cf-f0200d43474d-operator-scripts\") pod \"aodh-db-create-kb4pr\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.595947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sq5s\" (UniqueName: \"kubernetes.io/projected/b1a395ac-364e-4900-a705-366bf21d4cff-kube-api-access-4sq5s\") pod \"aodh-58d5-account-create-update-4zxmv\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.595978 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a395ac-364e-4900-a705-366bf21d4cff-operator-scripts\") pod \"aodh-58d5-account-create-update-4zxmv\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " 
pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.596903 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab18c2d-33f1-49e1-a4cf-f0200d43474d-operator-scripts\") pod \"aodh-db-create-kb4pr\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.617793 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvkp\" (UniqueName: \"kubernetes.io/projected/dab18c2d-33f1-49e1-a4cf-f0200d43474d-kube-api-access-qqvkp\") pod \"aodh-db-create-kb4pr\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.682858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerStarted","Data":"fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192"} Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.698686 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sq5s\" (UniqueName: \"kubernetes.io/projected/b1a395ac-364e-4900-a705-366bf21d4cff-kube-api-access-4sq5s\") pod \"aodh-58d5-account-create-update-4zxmv\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.698746 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a395ac-364e-4900-a705-366bf21d4cff-operator-scripts\") pod \"aodh-58d5-account-create-update-4zxmv\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: 
I0318 13:55:55.699765 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a395ac-364e-4900-a705-366bf21d4cff-operator-scripts\") pod \"aodh-58d5-account-create-update-4zxmv\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.708446 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.712485 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w7cwk" podStartSLOduration=3.198408404 podStartE2EDuration="6.712463953s" podCreationTimestamp="2026-03-18 13:55:49 +0000 UTC" firstStartedPulling="2026-03-18 13:55:51.609849883 +0000 UTC m=+6371.159770522" lastFinishedPulling="2026-03-18 13:55:55.123905432 +0000 UTC m=+6374.673826071" observedRunningTime="2026-03-18 13:55:55.705661938 +0000 UTC m=+6375.255582577" watchObservedRunningTime="2026-03-18 13:55:55.712463953 +0000 UTC m=+6375.262384592" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.723775 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sq5s\" (UniqueName: \"kubernetes.io/projected/b1a395ac-364e-4900-a705-366bf21d4cff-kube-api-access-4sq5s\") pod \"aodh-58d5-account-create-update-4zxmv\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:55 crc kubenswrapper[4921]: I0318 13:55:55.815453 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:56 crc kubenswrapper[4921]: I0318 13:55:56.384564 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kb4pr"] Mar 18 13:55:56 crc kubenswrapper[4921]: I0318 13:55:56.507905 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-58d5-account-create-update-4zxmv"] Mar 18 13:55:56 crc kubenswrapper[4921]: W0318 13:55:56.508464 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1a395ac_364e_4900_a705_366bf21d4cff.slice/crio-8c9e0635a89245599fdb7e0252ffa54e7482bf98ca5df16cb85b3451d893e7aa WatchSource:0}: Error finding container 8c9e0635a89245599fdb7e0252ffa54e7482bf98ca5df16cb85b3451d893e7aa: Status 404 returned error can't find the container with id 8c9e0635a89245599fdb7e0252ffa54e7482bf98ca5df16cb85b3451d893e7aa Mar 18 13:55:56 crc kubenswrapper[4921]: I0318 13:55:56.691087 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-58d5-account-create-update-4zxmv" event={"ID":"b1a395ac-364e-4900-a705-366bf21d4cff","Type":"ContainerStarted","Data":"8c9e0635a89245599fdb7e0252ffa54e7482bf98ca5df16cb85b3451d893e7aa"} Mar 18 13:55:56 crc kubenswrapper[4921]: I0318 13:55:56.692594 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kb4pr" event={"ID":"dab18c2d-33f1-49e1-a4cf-f0200d43474d","Type":"ContainerStarted","Data":"3a09dc3e5aa91c034c19d5040307fe817db63c62ede23577cb4123eafad8fea7"} Mar 18 13:55:57 crc kubenswrapper[4921]: I0318 13:55:57.723137 4921 generic.go:334] "Generic (PLEG): container finished" podID="b1a395ac-364e-4900-a705-366bf21d4cff" containerID="67a64396e681353d6daabf1d87f24147c70dcb7dc467543dc09c98729fb0c219" exitCode=0 Mar 18 13:55:57 crc kubenswrapper[4921]: I0318 13:55:57.723214 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-58d5-account-create-update-4zxmv" event={"ID":"b1a395ac-364e-4900-a705-366bf21d4cff","Type":"ContainerDied","Data":"67a64396e681353d6daabf1d87f24147c70dcb7dc467543dc09c98729fb0c219"} Mar 18 13:55:57 crc kubenswrapper[4921]: I0318 13:55:57.725968 4921 generic.go:334] "Generic (PLEG): container finished" podID="dab18c2d-33f1-49e1-a4cf-f0200d43474d" containerID="27e014aae387f6c4cd7700096ce984c894a100716b0d67dde6a719c528f104ab" exitCode=0 Mar 18 13:55:57 crc kubenswrapper[4921]: I0318 13:55:57.725997 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kb4pr" event={"ID":"dab18c2d-33f1-49e1-a4cf-f0200d43474d","Type":"ContainerDied","Data":"27e014aae387f6c4cd7700096ce984c894a100716b0d67dde6a719c528f104ab"} Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.219861 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.228706 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.299751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvkp\" (UniqueName: \"kubernetes.io/projected/dab18c2d-33f1-49e1-a4cf-f0200d43474d-kube-api-access-qqvkp\") pod \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.299897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sq5s\" (UniqueName: \"kubernetes.io/projected/b1a395ac-364e-4900-a705-366bf21d4cff-kube-api-access-4sq5s\") pod \"b1a395ac-364e-4900-a705-366bf21d4cff\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.299918 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a395ac-364e-4900-a705-366bf21d4cff-operator-scripts\") pod \"b1a395ac-364e-4900-a705-366bf21d4cff\" (UID: \"b1a395ac-364e-4900-a705-366bf21d4cff\") " Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.299998 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab18c2d-33f1-49e1-a4cf-f0200d43474d-operator-scripts\") pod \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\" (UID: \"dab18c2d-33f1-49e1-a4cf-f0200d43474d\") " Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.301400 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a395ac-364e-4900-a705-366bf21d4cff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1a395ac-364e-4900-a705-366bf21d4cff" (UID: "b1a395ac-364e-4900-a705-366bf21d4cff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.301806 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab18c2d-33f1-49e1-a4cf-f0200d43474d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dab18c2d-33f1-49e1-a4cf-f0200d43474d" (UID: "dab18c2d-33f1-49e1-a4cf-f0200d43474d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.305770 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a395ac-364e-4900-a705-366bf21d4cff-kube-api-access-4sq5s" (OuterVolumeSpecName: "kube-api-access-4sq5s") pod "b1a395ac-364e-4900-a705-366bf21d4cff" (UID: "b1a395ac-364e-4900-a705-366bf21d4cff"). InnerVolumeSpecName "kube-api-access-4sq5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.306360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab18c2d-33f1-49e1-a4cf-f0200d43474d-kube-api-access-qqvkp" (OuterVolumeSpecName: "kube-api-access-qqvkp") pod "dab18c2d-33f1-49e1-a4cf-f0200d43474d" (UID: "dab18c2d-33f1-49e1-a4cf-f0200d43474d"). InnerVolumeSpecName "kube-api-access-qqvkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.402125 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sq5s\" (UniqueName: \"kubernetes.io/projected/b1a395ac-364e-4900-a705-366bf21d4cff-kube-api-access-4sq5s\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.402448 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a395ac-364e-4900-a705-366bf21d4cff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.402458 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dab18c2d-33f1-49e1-a4cf-f0200d43474d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.402470 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvkp\" (UniqueName: \"kubernetes.io/projected/dab18c2d-33f1-49e1-a4cf-f0200d43474d-kube-api-access-qqvkp\") on node \"crc\" DevicePath \"\"" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.749858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-58d5-account-create-update-4zxmv" event={"ID":"b1a395ac-364e-4900-a705-366bf21d4cff","Type":"ContainerDied","Data":"8c9e0635a89245599fdb7e0252ffa54e7482bf98ca5df16cb85b3451d893e7aa"} Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.749920 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c9e0635a89245599fdb7e0252ffa54e7482bf98ca5df16cb85b3451d893e7aa" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.749921 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-58d5-account-create-update-4zxmv" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.752456 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kb4pr" event={"ID":"dab18c2d-33f1-49e1-a4cf-f0200d43474d","Type":"ContainerDied","Data":"3a09dc3e5aa91c034c19d5040307fe817db63c62ede23577cb4123eafad8fea7"} Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.752504 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kb4pr" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.752503 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a09dc3e5aa91c034c19d5040307fe817db63c62ede23577cb4123eafad8fea7" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.796191 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.844612 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.941598 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.941677 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:55:59 crc kubenswrapper[4921]: I0318 13:55:59.989778 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.039832 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ldkf5"] Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.136595 4921 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564036-wfcm4"] Mar 18 13:56:00 crc kubenswrapper[4921]: E0318 13:56:00.137270 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a395ac-364e-4900-a705-366bf21d4cff" containerName="mariadb-account-create-update" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.137412 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a395ac-364e-4900-a705-366bf21d4cff" containerName="mariadb-account-create-update" Mar 18 13:56:00 crc kubenswrapper[4921]: E0318 13:56:00.137473 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab18c2d-33f1-49e1-a4cf-f0200d43474d" containerName="mariadb-database-create" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.137533 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab18c2d-33f1-49e1-a4cf-f0200d43474d" containerName="mariadb-database-create" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.137804 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a395ac-364e-4900-a705-366bf21d4cff" containerName="mariadb-account-create-update" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.137941 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab18c2d-33f1-49e1-a4cf-f0200d43474d" containerName="mariadb-database-create" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.138721 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.148493 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.148687 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.148844 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.152863 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-wfcm4"] Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.208972 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:56:00 crc kubenswrapper[4921]: E0318 13:56:00.209281 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.219763 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqcd4\" (UniqueName: \"kubernetes.io/projected/d18516be-a913-4d85-9d32-265cc891709a-kube-api-access-qqcd4\") pod \"auto-csr-approver-29564036-wfcm4\" (UID: \"d18516be-a913-4d85-9d32-265cc891709a\") " pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.321617 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qqcd4\" (UniqueName: \"kubernetes.io/projected/d18516be-a913-4d85-9d32-265cc891709a-kube-api-access-qqcd4\") pod \"auto-csr-approver-29564036-wfcm4\" (UID: \"d18516be-a913-4d85-9d32-265cc891709a\") " pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.343435 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqcd4\" (UniqueName: \"kubernetes.io/projected/d18516be-a913-4d85-9d32-265cc891709a-kube-api-access-qqcd4\") pod \"auto-csr-approver-29564036-wfcm4\" (UID: \"d18516be-a913-4d85-9d32-265cc891709a\") " pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.467782 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.828319 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-w92g6"] Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.834628 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.835572 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.839474 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-w92g6"] Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.844662 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.844684 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.844708 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8tc9k" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.846197 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.937233 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-scripts\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.937311 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5zv6\" (UniqueName: \"kubernetes.io/projected/3ef32792-2603-4b63-8bf7-3413655270db-kube-api-access-x5zv6\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.937341 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-config-data\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:00 crc kubenswrapper[4921]: I0318 13:56:00.937406 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-combined-ca-bundle\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.039775 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-scripts\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.039842 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5zv6\" (UniqueName: \"kubernetes.io/projected/3ef32792-2603-4b63-8bf7-3413655270db-kube-api-access-x5zv6\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.039876 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-config-data\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.039928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-combined-ca-bundle\") pod \"aodh-db-sync-w92g6\" (UID: 
\"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.047837 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-scripts\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.048261 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-combined-ca-bundle\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.049203 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-config-data\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.053678 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-wfcm4"] Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.063357 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5zv6\" (UniqueName: \"kubernetes.io/projected/3ef32792-2603-4b63-8bf7-3413655270db-kube-api-access-x5zv6\") pod \"aodh-db-sync-w92g6\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.159577 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.690952 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-w92g6"] Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.770141 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-w92g6" event={"ID":"3ef32792-2603-4b63-8bf7-3413655270db","Type":"ContainerStarted","Data":"c1be0672bf90f30b77de4ee890d77e85f12c767219dc6d2448d85ca12880be88"} Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.772270 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" event={"ID":"d18516be-a913-4d85-9d32-265cc891709a","Type":"ContainerStarted","Data":"f4f5498d01cd51a428c4bc5305578a40781068e59316a8b9d12f7ed8590deda2"} Mar 18 13:56:01 crc kubenswrapper[4921]: I0318 13:56:01.772479 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ldkf5" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="registry-server" containerID="cri-o://6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d" gracePeriod=2 Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.256268 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7cwk"] Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.445436 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.575008 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-utilities\") pod \"c5a6b043-388f-4a9e-9a21-ec65235abf20\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.575057 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bl2m\" (UniqueName: \"kubernetes.io/projected/c5a6b043-388f-4a9e-9a21-ec65235abf20-kube-api-access-7bl2m\") pod \"c5a6b043-388f-4a9e-9a21-ec65235abf20\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.575177 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-catalog-content\") pod \"c5a6b043-388f-4a9e-9a21-ec65235abf20\" (UID: \"c5a6b043-388f-4a9e-9a21-ec65235abf20\") " Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.575732 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-utilities" (OuterVolumeSpecName: "utilities") pod "c5a6b043-388f-4a9e-9a21-ec65235abf20" (UID: "c5a6b043-388f-4a9e-9a21-ec65235abf20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.600709 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a6b043-388f-4a9e-9a21-ec65235abf20-kube-api-access-7bl2m" (OuterVolumeSpecName: "kube-api-access-7bl2m") pod "c5a6b043-388f-4a9e-9a21-ec65235abf20" (UID: "c5a6b043-388f-4a9e-9a21-ec65235abf20"). InnerVolumeSpecName "kube-api-access-7bl2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.642963 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5a6b043-388f-4a9e-9a21-ec65235abf20" (UID: "c5a6b043-388f-4a9e-9a21-ec65235abf20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.677742 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.677788 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5a6b043-388f-4a9e-9a21-ec65235abf20-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.677802 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bl2m\" (UniqueName: \"kubernetes.io/projected/c5a6b043-388f-4a9e-9a21-ec65235abf20-kube-api-access-7bl2m\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.803581 4921 generic.go:334] "Generic (PLEG): container finished" podID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerID="6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d" exitCode=0 Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.803619 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerDied","Data":"6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d"} Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.803667 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ldkf5" event={"ID":"c5a6b043-388f-4a9e-9a21-ec65235abf20","Type":"ContainerDied","Data":"a066688d7a3207d5062633c5239e9a133747c4e4629ef0697a16db3db09d2073"} Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.803688 4921 scope.go:117] "RemoveContainer" containerID="6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.803635 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ldkf5" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.803946 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w7cwk" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="registry-server" containerID="cri-o://fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192" gracePeriod=2 Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.842243 4921 scope.go:117] "RemoveContainer" containerID="0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.926897 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ldkf5"] Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.934563 4921 scope.go:117] "RemoveContainer" containerID="10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9" Mar 18 13:56:02 crc kubenswrapper[4921]: I0318 13:56:02.935457 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ldkf5"] Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.019853 4921 scope.go:117] "RemoveContainer" containerID="6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d" Mar 18 13:56:03 crc kubenswrapper[4921]: E0318 13:56:03.020377 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d\": container with ID starting with 6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d not found: ID does not exist" containerID="6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.020430 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d"} err="failed to get container status \"6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d\": rpc error: code = NotFound desc = could not find container \"6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d\": container with ID starting with 6dde2ee21fc23fb68d76d101dc5ce706d03b4e1bcc7ac4488fb8b8a2cf88d55d not found: ID does not exist" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.020456 4921 scope.go:117] "RemoveContainer" containerID="0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa" Mar 18 13:56:03 crc kubenswrapper[4921]: E0318 13:56:03.021200 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa\": container with ID starting with 0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa not found: ID does not exist" containerID="0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.021237 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa"} err="failed to get container status \"0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa\": rpc error: code = NotFound desc = could not find container 
\"0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa\": container with ID starting with 0d85695c5b7ad5f30b8ac199734939933342e4831fce50fb13a83485c35ebefa not found: ID does not exist" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.021262 4921 scope.go:117] "RemoveContainer" containerID="10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9" Mar 18 13:56:03 crc kubenswrapper[4921]: E0318 13:56:03.021653 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9\": container with ID starting with 10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9 not found: ID does not exist" containerID="10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.021693 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9"} err="failed to get container status \"10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9\": rpc error: code = NotFound desc = could not find container \"10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9\": container with ID starting with 10c8abdd61632846bab96f1c625d5e2e195976085035e2ee17cc2b25a5feb8d9 not found: ID does not exist" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.234981 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" path="/var/lib/kubelet/pods/c5a6b043-388f-4a9e-9a21-ec65235abf20/volumes" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.312490 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.403037 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-utilities\") pod \"082ed6d1-fe58-4faf-9b82-7c5292191511\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.403344 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8tc\" (UniqueName: \"kubernetes.io/projected/082ed6d1-fe58-4faf-9b82-7c5292191511-kube-api-access-7f8tc\") pod \"082ed6d1-fe58-4faf-9b82-7c5292191511\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.403885 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-catalog-content\") pod \"082ed6d1-fe58-4faf-9b82-7c5292191511\" (UID: \"082ed6d1-fe58-4faf-9b82-7c5292191511\") " Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.415306 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-utilities" (OuterVolumeSpecName: "utilities") pod "082ed6d1-fe58-4faf-9b82-7c5292191511" (UID: "082ed6d1-fe58-4faf-9b82-7c5292191511"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.416036 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082ed6d1-fe58-4faf-9b82-7c5292191511-kube-api-access-7f8tc" (OuterVolumeSpecName: "kube-api-access-7f8tc") pod "082ed6d1-fe58-4faf-9b82-7c5292191511" (UID: "082ed6d1-fe58-4faf-9b82-7c5292191511"). InnerVolumeSpecName "kube-api-access-7f8tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.435394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "082ed6d1-fe58-4faf-9b82-7c5292191511" (UID: "082ed6d1-fe58-4faf-9b82-7c5292191511"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.509225 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.509260 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f8tc\" (UniqueName: \"kubernetes.io/projected/082ed6d1-fe58-4faf-9b82-7c5292191511-kube-api-access-7f8tc\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.509272 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082ed6d1-fe58-4faf-9b82-7c5292191511-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.821751 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" event={"ID":"d18516be-a913-4d85-9d32-265cc891709a","Type":"ContainerStarted","Data":"096664877cf08671ec5e2aadfd8b4fe3c9b8263f3852f0bc32d0e6986e8d750c"} Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.827422 4921 generic.go:334] "Generic (PLEG): container finished" podID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerID="fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192" exitCode=0 Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.827485 4921 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerDied","Data":"fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192"} Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.827516 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7cwk" event={"ID":"082ed6d1-fe58-4faf-9b82-7c5292191511","Type":"ContainerDied","Data":"1cac98a192fb8ed293facc715707a72d2bdc52f79c61cba1abd10b8576668e13"} Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.827522 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7cwk" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.827666 4921 scope.go:117] "RemoveContainer" containerID="fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.885414 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" podStartSLOduration=2.564752563 podStartE2EDuration="3.885391938s" podCreationTimestamp="2026-03-18 13:56:00 +0000 UTC" firstStartedPulling="2026-03-18 13:56:01.0659781 +0000 UTC m=+6380.615898739" lastFinishedPulling="2026-03-18 13:56:02.386617475 +0000 UTC m=+6381.936538114" observedRunningTime="2026-03-18 13:56:03.837177786 +0000 UTC m=+6383.387098445" watchObservedRunningTime="2026-03-18 13:56:03.885391938 +0000 UTC m=+6383.435312577" Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.889417 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7cwk"] Mar 18 13:56:03 crc kubenswrapper[4921]: I0318 13:56:03.897661 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7cwk"] Mar 18 13:56:04 crc kubenswrapper[4921]: I0318 13:56:04.855878 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="d18516be-a913-4d85-9d32-265cc891709a" containerID="096664877cf08671ec5e2aadfd8b4fe3c9b8263f3852f0bc32d0e6986e8d750c" exitCode=0 Mar 18 13:56:04 crc kubenswrapper[4921]: I0318 13:56:04.856253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" event={"ID":"d18516be-a913-4d85-9d32-265cc891709a","Type":"ContainerDied","Data":"096664877cf08671ec5e2aadfd8b4fe3c9b8263f3852f0bc32d0e6986e8d750c"} Mar 18 13:56:05 crc kubenswrapper[4921]: I0318 13:56:05.222794 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" path="/var/lib/kubelet/pods/082ed6d1-fe58-4faf-9b82-7c5292191511/volumes" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.117807 4921 scope.go:117] "RemoveContainer" containerID="bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.162899 4921 scope.go:117] "RemoveContainer" containerID="300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.309325 4921 scope.go:117] "RemoveContainer" containerID="fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192" Mar 18 13:56:06 crc kubenswrapper[4921]: E0318 13:56:06.309944 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192\": container with ID starting with fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192 not found: ID does not exist" containerID="fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.310022 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192"} err="failed to get container status 
\"fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192\": rpc error: code = NotFound desc = could not find container \"fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192\": container with ID starting with fa3450d2fe5e1565c9d238a6ada3e0901b10823eef8561ce0ab1299785d4c192 not found: ID does not exist" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.310064 4921 scope.go:117] "RemoveContainer" containerID="bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc" Mar 18 13:56:06 crc kubenswrapper[4921]: E0318 13:56:06.310677 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc\": container with ID starting with bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc not found: ID does not exist" containerID="bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.310734 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc"} err="failed to get container status \"bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc\": rpc error: code = NotFound desc = could not find container \"bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc\": container with ID starting with bd8b7a683c958fb53aa42a984518831d761f059c9e6aaed49ce9b08c205a15dc not found: ID does not exist" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.310779 4921 scope.go:117] "RemoveContainer" containerID="300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35" Mar 18 13:56:06 crc kubenswrapper[4921]: E0318 13:56:06.311409 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35\": container with ID starting with 300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35 not found: ID does not exist" containerID="300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.311485 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35"} err="failed to get container status \"300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35\": rpc error: code = NotFound desc = could not find container \"300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35\": container with ID starting with 300fd8679c56f0c9fc2428df65e25e73e4a3797ebbff12d01d726c6522f0ba35 not found: ID does not exist" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.408548 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.474128 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqcd4\" (UniqueName: \"kubernetes.io/projected/d18516be-a913-4d85-9d32-265cc891709a-kube-api-access-qqcd4\") pod \"d18516be-a913-4d85-9d32-265cc891709a\" (UID: \"d18516be-a913-4d85-9d32-265cc891709a\") " Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.496429 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18516be-a913-4d85-9d32-265cc891709a-kube-api-access-qqcd4" (OuterVolumeSpecName: "kube-api-access-qqcd4") pod "d18516be-a913-4d85-9d32-265cc891709a" (UID: "d18516be-a913-4d85-9d32-265cc891709a"). InnerVolumeSpecName "kube-api-access-qqcd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.576663 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqcd4\" (UniqueName: \"kubernetes.io/projected/d18516be-a913-4d85-9d32-265cc891709a-kube-api-access-qqcd4\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.899763 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-w92g6" event={"ID":"3ef32792-2603-4b63-8bf7-3413655270db","Type":"ContainerStarted","Data":"2f765237217e582f834742b79cbfed112a1d81853c0d2b3a85d5567135f55fcc"} Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.902922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" event={"ID":"d18516be-a913-4d85-9d32-265cc891709a","Type":"ContainerDied","Data":"f4f5498d01cd51a428c4bc5305578a40781068e59316a8b9d12f7ed8590deda2"} Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.903053 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f5498d01cd51a428c4bc5305578a40781068e59316a8b9d12f7ed8590deda2" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.902961 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564036-wfcm4" Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.912055 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-8csrm"] Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.922894 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-8csrm"] Mar 18 13:56:06 crc kubenswrapper[4921]: I0318 13:56:06.927612 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-w92g6" podStartSLOduration=2.463648493 podStartE2EDuration="6.927593521s" podCreationTimestamp="2026-03-18 13:56:00 +0000 UTC" firstStartedPulling="2026-03-18 13:56:01.70022478 +0000 UTC m=+6381.250145419" lastFinishedPulling="2026-03-18 13:56:06.164169808 +0000 UTC m=+6385.714090447" observedRunningTime="2026-03-18 13:56:06.917695188 +0000 UTC m=+6386.467615827" watchObservedRunningTime="2026-03-18 13:56:06.927593521 +0000 UTC m=+6386.477514160" Mar 18 13:56:07 crc kubenswrapper[4921]: I0318 13:56:07.220332 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad24e41-3dca-4d97-b126-6692b454bc28" path="/var/lib/kubelet/pods/0ad24e41-3dca-4d97-b126-6692b454bc28/volumes" Mar 18 13:56:08 crc kubenswrapper[4921]: I0318 13:56:08.922015 4921 generic.go:334] "Generic (PLEG): container finished" podID="3ef32792-2603-4b63-8bf7-3413655270db" containerID="2f765237217e582f834742b79cbfed112a1d81853c0d2b3a85d5567135f55fcc" exitCode=0 Mar 18 13:56:08 crc kubenswrapper[4921]: I0318 13:56:08.922130 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-w92g6" event={"ID":"3ef32792-2603-4b63-8bf7-3413655270db","Type":"ContainerDied","Data":"2f765237217e582f834742b79cbfed112a1d81853c0d2b3a85d5567135f55fcc"} Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.375894 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.579737 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5zv6\" (UniqueName: \"kubernetes.io/projected/3ef32792-2603-4b63-8bf7-3413655270db-kube-api-access-x5zv6\") pod \"3ef32792-2603-4b63-8bf7-3413655270db\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.580251 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-combined-ca-bundle\") pod \"3ef32792-2603-4b63-8bf7-3413655270db\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.580283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-scripts\") pod \"3ef32792-2603-4b63-8bf7-3413655270db\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.580481 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-config-data\") pod \"3ef32792-2603-4b63-8bf7-3413655270db\" (UID: \"3ef32792-2603-4b63-8bf7-3413655270db\") " Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.594474 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef32792-2603-4b63-8bf7-3413655270db-kube-api-access-x5zv6" (OuterVolumeSpecName: "kube-api-access-x5zv6") pod "3ef32792-2603-4b63-8bf7-3413655270db" (UID: "3ef32792-2603-4b63-8bf7-3413655270db"). InnerVolumeSpecName "kube-api-access-x5zv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.599487 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-scripts" (OuterVolumeSpecName: "scripts") pod "3ef32792-2603-4b63-8bf7-3413655270db" (UID: "3ef32792-2603-4b63-8bf7-3413655270db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.610519 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ef32792-2603-4b63-8bf7-3413655270db" (UID: "3ef32792-2603-4b63-8bf7-3413655270db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.622243 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-config-data" (OuterVolumeSpecName: "config-data") pod "3ef32792-2603-4b63-8bf7-3413655270db" (UID: "3ef32792-2603-4b63-8bf7-3413655270db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.684958 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5zv6\" (UniqueName: \"kubernetes.io/projected/3ef32792-2603-4b63-8bf7-3413655270db-kube-api-access-x5zv6\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.685025 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.685035 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.685045 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef32792-2603-4b63-8bf7-3413655270db-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.944774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-w92g6" event={"ID":"3ef32792-2603-4b63-8bf7-3413655270db","Type":"ContainerDied","Data":"c1be0672bf90f30b77de4ee890d77e85f12c767219dc6d2448d85ca12880be88"} Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.945221 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1be0672bf90f30b77de4ee890d77e85f12c767219dc6d2448d85ca12880be88" Mar 18 13:56:10 crc kubenswrapper[4921]: I0318 13:56:10.944868 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-w92g6" Mar 18 13:56:13 crc kubenswrapper[4921]: I0318 13:56:13.196928 4921 scope.go:117] "RemoveContainer" containerID="17ae14acdc66e55d38a0ff3c961fd7c68afac34991ffb89928aa230c27349a56" Mar 18 13:56:13 crc kubenswrapper[4921]: I0318 13:56:13.247427 4921 scope.go:117] "RemoveContainer" containerID="bc93348d1519329bb49846c74df5daf0078783ff1a6ba57f4fd453e053c4e7b0" Mar 18 13:56:13 crc kubenswrapper[4921]: I0318 13:56:13.283060 4921 scope.go:117] "RemoveContainer" containerID="e97220eb3c36c88dd7c95e7b0b6347c9f9d43ed9936de0f523fea1693b0a7006" Mar 18 13:56:13 crc kubenswrapper[4921]: I0318 13:56:13.342846 4921 scope.go:117] "RemoveContainer" containerID="7a854a2db0099288bc2fb4296cb843e15edd5f18e2703086f2befc7e7b0bf2ce" Mar 18 13:56:13 crc kubenswrapper[4921]: I0318 13:56:13.386766 4921 scope.go:117] "RemoveContainer" containerID="62b849949f940a3874e26e5f7b3d5ebd9b5f2ac08f839c1ec1399e9d6ef5c94f" Mar 18 13:56:13 crc kubenswrapper[4921]: I0318 13:56:13.961911 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.209948 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.210309 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.925579 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926628 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="extract-utilities" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926652 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="extract-utilities" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926664 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef32792-2603-4b63-8bf7-3413655270db" containerName="aodh-db-sync" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926674 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef32792-2603-4b63-8bf7-3413655270db" containerName="aodh-db-sync" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926706 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="registry-server" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926715 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="registry-server" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926733 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="extract-content" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926741 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="extract-content" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926757 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="extract-content" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926765 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="extract-content" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926779 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="extract-utilities" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926786 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="extract-utilities" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926807 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18516be-a913-4d85-9d32-265cc891709a" containerName="oc" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926814 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18516be-a913-4d85-9d32-265cc891709a" containerName="oc" Mar 18 13:56:15 crc kubenswrapper[4921]: E0318 13:56:15.926832 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="registry-server" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.926840 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="registry-server" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.927087 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a6b043-388f-4a9e-9a21-ec65235abf20" containerName="registry-server" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.927129 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="082ed6d1-fe58-4faf-9b82-7c5292191511" containerName="registry-server" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.927143 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef32792-2603-4b63-8bf7-3413655270db" containerName="aodh-db-sync" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.927160 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18516be-a913-4d85-9d32-265cc891709a" containerName="oc" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.929611 4921 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.933714 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.935148 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.935434 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-8tc9k" Mar 18 13:56:15 crc kubenswrapper[4921]: I0318 13:56:15.944328 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.091682 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7jh\" (UniqueName: \"kubernetes.io/projected/7dc34189-0095-41c5-8847-f91ddb972ce0-kube-api-access-7t7jh\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.091779 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-config-data\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.092186 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-scripts\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.092264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.194416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-scripts\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.194463 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.194555 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7jh\" (UniqueName: \"kubernetes.io/projected/7dc34189-0095-41c5-8847-f91ddb972ce0-kube-api-access-7t7jh\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.194623 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-config-data\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.200346 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.201277 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-scripts\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.203264 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dc34189-0095-41c5-8847-f91ddb972ce0-config-data\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.213363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7jh\" (UniqueName: \"kubernetes.io/projected/7dc34189-0095-41c5-8847-f91ddb972ce0-kube-api-access-7t7jh\") pod \"aodh-0\" (UID: \"7dc34189-0095-41c5-8847-f91ddb972ce0\") " pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.261718 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 13:56:16 crc kubenswrapper[4921]: I0318 13:56:16.795719 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 13:56:17 crc kubenswrapper[4921]: I0318 13:56:17.001683 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7dc34189-0095-41c5-8847-f91ddb972ce0","Type":"ContainerStarted","Data":"b5366bd46b40bd3ebf4a272c03bd72f6f4f0edc0d93d9b41c909b7349e636690"} Mar 18 13:56:18 crc kubenswrapper[4921]: I0318 13:56:18.017657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7dc34189-0095-41c5-8847-f91ddb972ce0","Type":"ContainerStarted","Data":"9208b069d8b21bc061929f8486c37b68383adfb981ce98ced72924ecee58374f"} Mar 18 13:56:18 crc kubenswrapper[4921]: I0318 13:56:18.382055 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:18 crc kubenswrapper[4921]: I0318 13:56:18.382557 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-central-agent" containerID="cri-o://acf1ebbb0f74c1f0254b7bce7af00d6a9886f37720583481d140271cea8c4522" gracePeriod=30 Mar 18 13:56:18 crc kubenswrapper[4921]: I0318 13:56:18.382647 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-notification-agent" containerID="cri-o://e9f8ba56ac3a6ecb7cbcf33f0b6f75b254142fa773ec6e6637b19dad339ecf22" gracePeriod=30 Mar 18 13:56:18 crc kubenswrapper[4921]: I0318 13:56:18.382652 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="sg-core" containerID="cri-o://92acc74489094357239191794278d92047fbfd8763622cbb29e69f48d8a49bf1" gracePeriod=30 Mar 18 13:56:18 crc kubenswrapper[4921]: I0318 13:56:18.382633 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="proxy-httpd" containerID="cri-o://b92a6af7b4f9ecfa2c96a60d6c1362cd557f76a4a811357d282d4bd7cd2e44fa" gracePeriod=30 Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062650 4921 generic.go:334] "Generic (PLEG): container finished" podID="cea7f817-98f3-42d6-a9ee-73183d043599" containerID="b92a6af7b4f9ecfa2c96a60d6c1362cd557f76a4a811357d282d4bd7cd2e44fa" exitCode=0 Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062888 4921 generic.go:334] "Generic (PLEG): container finished" podID="cea7f817-98f3-42d6-a9ee-73183d043599" containerID="92acc74489094357239191794278d92047fbfd8763622cbb29e69f48d8a49bf1" exitCode=2 Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062897 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="cea7f817-98f3-42d6-a9ee-73183d043599" containerID="e9f8ba56ac3a6ecb7cbcf33f0b6f75b254142fa773ec6e6637b19dad339ecf22" exitCode=0 Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062904 4921 generic.go:334] "Generic (PLEG): container finished" podID="cea7f817-98f3-42d6-a9ee-73183d043599" containerID="acf1ebbb0f74c1f0254b7bce7af00d6a9886f37720583481d140271cea8c4522" exitCode=0 Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062936 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerDied","Data":"b92a6af7b4f9ecfa2c96a60d6c1362cd557f76a4a811357d282d4bd7cd2e44fa"} Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062963 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerDied","Data":"92acc74489094357239191794278d92047fbfd8763622cbb29e69f48d8a49bf1"} Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062972 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerDied","Data":"e9f8ba56ac3a6ecb7cbcf33f0b6f75b254142fa773ec6e6637b19dad339ecf22"} Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.062980 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerDied","Data":"acf1ebbb0f74c1f0254b7bce7af00d6a9886f37720583481d140271cea8c4522"} Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.314711 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.485459 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-log-httpd\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.485792 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-combined-ca-bundle\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.485870 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-config-data\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.485943 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-scripts\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.485992 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-sg-core-conf-yaml\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.486039 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsv8j\" (UniqueName: 
\"kubernetes.io/projected/cea7f817-98f3-42d6-a9ee-73183d043599-kube-api-access-jsv8j\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.486088 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-run-httpd\") pod \"cea7f817-98f3-42d6-a9ee-73183d043599\" (UID: \"cea7f817-98f3-42d6-a9ee-73183d043599\") " Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.486088 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.486882 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.486893 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.491215 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-scripts" (OuterVolumeSpecName: "scripts") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.494260 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea7f817-98f3-42d6-a9ee-73183d043599-kube-api-access-jsv8j" (OuterVolumeSpecName: "kube-api-access-jsv8j") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). InnerVolumeSpecName "kube-api-access-jsv8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.517826 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.578204 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.589402 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.589450 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.589466 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsv8j\" (UniqueName: \"kubernetes.io/projected/cea7f817-98f3-42d6-a9ee-73183d043599-kube-api-access-jsv8j\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.589480 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea7f817-98f3-42d6-a9ee-73183d043599-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.589494 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.603342 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-config-data" (OuterVolumeSpecName: "config-data") pod "cea7f817-98f3-42d6-a9ee-73183d043599" (UID: "cea7f817-98f3-42d6-a9ee-73183d043599"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:19 crc kubenswrapper[4921]: I0318 13:56:19.691895 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea7f817-98f3-42d6-a9ee-73183d043599-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.072900 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7dc34189-0095-41c5-8847-f91ddb972ce0","Type":"ContainerStarted","Data":"e214ea0a6205a7c83ae22f588cc11b2a1716db9ab0998c322fcac03679df0f84"} Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.080542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea7f817-98f3-42d6-a9ee-73183d043599","Type":"ContainerDied","Data":"ee728694a484c1fd89ffd5c8d1de632690c3d59e4f8ecd4e4b7bdf083f7d6f82"} Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.080573 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.080606 4921 scope.go:117] "RemoveContainer" containerID="b92a6af7b4f9ecfa2c96a60d6c1362cd557f76a4a811357d282d4bd7cd2e44fa" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.146256 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.161863 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.173764 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:20 crc kubenswrapper[4921]: E0318 13:56:20.174290 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-notification-agent" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174312 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-notification-agent" Mar 18 13:56:20 crc kubenswrapper[4921]: E0318 13:56:20.174341 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="sg-core" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174348 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="sg-core" Mar 18 13:56:20 crc kubenswrapper[4921]: E0318 13:56:20.174358 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-central-agent" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174364 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-central-agent" Mar 18 13:56:20 crc kubenswrapper[4921]: E0318 13:56:20.174384 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="proxy-httpd" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174391 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="proxy-httpd" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174568 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-central-agent" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="proxy-httpd" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174600 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="sg-core" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.174618 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" containerName="ceilometer-notification-agent" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.177225 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.181788 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.182149 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.189213 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.305827 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4xtq\" (UniqueName: \"kubernetes.io/projected/171a2627-97a7-40a9-b604-1528694f1de0-kube-api-access-f4xtq\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.305987 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-run-httpd\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.306013 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-scripts\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.306093 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-log-httpd\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " 
pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.306187 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.306213 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-config-data\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.306252 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.318308 4921 scope.go:117] "RemoveContainer" containerID="92acc74489094357239191794278d92047fbfd8763622cbb29e69f48d8a49bf1" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.363330 4921 scope.go:117] "RemoveContainer" containerID="e9f8ba56ac3a6ecb7cbcf33f0b6f75b254142fa773ec6e6637b19dad339ecf22" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.385169 4921 scope.go:117] "RemoveContainer" containerID="acf1ebbb0f74c1f0254b7bce7af00d6a9886f37720583481d140271cea8c4522" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.407859 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.407918 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-scripts\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.408006 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-log-httpd\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.408088 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.408135 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-config-data\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.408182 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.408219 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4xtq\" 
(UniqueName: \"kubernetes.io/projected/171a2627-97a7-40a9-b604-1528694f1de0-kube-api-access-f4xtq\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.408375 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-run-httpd\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.410305 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-log-httpd\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.413502 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.413781 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-config-data\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.413909 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.417259 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-scripts\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.427401 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4xtq\" (UniqueName: \"kubernetes.io/projected/171a2627-97a7-40a9-b604-1528694f1de0-kube-api-access-f4xtq\") pod \"ceilometer-0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") " pod="openstack/ceilometer-0" Mar 18 13:56:20 crc kubenswrapper[4921]: I0318 13:56:20.497905 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:56:21 crc kubenswrapper[4921]: I0318 13:56:21.047831 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:21 crc kubenswrapper[4921]: I0318 13:56:21.092792 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerStarted","Data":"c8dcc370c79ec34f316c200ed812946f7dad105ba6473054f66df3ea31c5ed7d"} Mar 18 13:56:21 crc kubenswrapper[4921]: I0318 13:56:21.095081 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7dc34189-0095-41c5-8847-f91ddb972ce0","Type":"ContainerStarted","Data":"6949f263c39ff260ec06288fa83d704e0d3e3f50afd7aaaa58228103c169c588"} Mar 18 13:56:21 crc kubenswrapper[4921]: I0318 13:56:21.223008 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea7f817-98f3-42d6-a9ee-73183d043599" path="/var/lib/kubelet/pods/cea7f817-98f3-42d6-a9ee-73183d043599/volumes" Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.044451 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3c32-account-create-update-dx6t7"] Mar 18 13:56:23 crc 
kubenswrapper[4921]: I0318 13:56:23.054973 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-fsljz"] Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.067206 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3c32-account-create-update-dx6t7"] Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.077090 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-fsljz"] Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.117853 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7dc34189-0095-41c5-8847-f91ddb972ce0","Type":"ContainerStarted","Data":"a53fe3d8df9afb0e75b5f23cb3ebe1c56af4f4ea9fdf4a5a489c0acd6bc49449"} Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.120033 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerStarted","Data":"08840d8766598cc69601fc00b78e28da71c3db9af550624bd98ce5c049bde9da"} Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.143530 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.881874351 podStartE2EDuration="8.143505015s" podCreationTimestamp="2026-03-18 13:56:15 +0000 UTC" firstStartedPulling="2026-03-18 13:56:16.783033124 +0000 UTC m=+6396.332953773" lastFinishedPulling="2026-03-18 13:56:22.044663798 +0000 UTC m=+6401.594584437" observedRunningTime="2026-03-18 13:56:23.13913023 +0000 UTC m=+6402.689050889" watchObservedRunningTime="2026-03-18 13:56:23.143505015 +0000 UTC m=+6402.693425674" Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.230055 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e" path="/var/lib/kubelet/pods/4ce3ff25-4ac8-4b4a-b608-a3904b2d4d8e/volumes" Mar 18 13:56:23 crc kubenswrapper[4921]: I0318 13:56:23.231726 4921 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc696a06-c0a8-4f71-814c-49242aab9b67" path="/var/lib/kubelet/pods/bc696a06-c0a8-4f71-814c-49242aab9b67/volumes" Mar 18 13:56:24 crc kubenswrapper[4921]: I0318 13:56:24.132599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerStarted","Data":"6fd78cbe4c63ded946497591c1848e63068677d19e5018c19e326e2eac05f7b2"} Mar 18 13:56:24 crc kubenswrapper[4921]: I0318 13:56:24.133206 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerStarted","Data":"9d4f589e5a1cf0b2a5b947d09f7c05231dc5c5834d3a492674a7d6f100a42566"} Mar 18 13:56:27 crc kubenswrapper[4921]: I0318 13:56:27.175102 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerStarted","Data":"2940d151e8024eb0a893ccf7761d3fac67bc867c50a3c28e13d83f4284a7d192"} Mar 18 13:56:27 crc kubenswrapper[4921]: I0318 13:56:27.176006 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:56:27 crc kubenswrapper[4921]: I0318 13:56:27.205531 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.209960765 podStartE2EDuration="7.205508842s" podCreationTimestamp="2026-03-18 13:56:20 +0000 UTC" firstStartedPulling="2026-03-18 13:56:21.042057688 +0000 UTC m=+6400.591978327" lastFinishedPulling="2026-03-18 13:56:26.037605765 +0000 UTC m=+6405.587526404" observedRunningTime="2026-03-18 13:56:27.196530764 +0000 UTC m=+6406.746451403" watchObservedRunningTime="2026-03-18 13:56:27.205508842 +0000 UTC m=+6406.755429481" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.554606 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-db-create-frttr"] Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.556745 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.563822 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-frttr"] Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.660360 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-efa8-account-create-update-5pr48"] Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.661834 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.664180 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.682139 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-efa8-account-create-update-5pr48"] Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.709486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6m5w\" (UniqueName: \"kubernetes.io/projected/69c55e47-621b-45db-8447-543e81a28036-kube-api-access-q6m5w\") pod \"manila-db-create-frttr\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.709622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c55e47-621b-45db-8447-543e81a28036-operator-scripts\") pod \"manila-db-create-frttr\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.811462 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvgs\" (UniqueName: \"kubernetes.io/projected/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-kube-api-access-6jvgs\") pod \"manila-efa8-account-create-update-5pr48\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.811567 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6m5w\" (UniqueName: \"kubernetes.io/projected/69c55e47-621b-45db-8447-543e81a28036-kube-api-access-q6m5w\") pod \"manila-db-create-frttr\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.811664 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c55e47-621b-45db-8447-543e81a28036-operator-scripts\") pod \"manila-db-create-frttr\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.811696 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-operator-scripts\") pod \"manila-efa8-account-create-update-5pr48\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.812546 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c55e47-621b-45db-8447-543e81a28036-operator-scripts\") pod \"manila-db-create-frttr\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.852827 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6m5w\" (UniqueName: \"kubernetes.io/projected/69c55e47-621b-45db-8447-543e81a28036-kube-api-access-q6m5w\") pod \"manila-db-create-frttr\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.889378 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-frttr" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.913769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-operator-scripts\") pod \"manila-efa8-account-create-update-5pr48\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.913886 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvgs\" (UniqueName: \"kubernetes.io/projected/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-kube-api-access-6jvgs\") pod \"manila-efa8-account-create-update-5pr48\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.917097 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-operator-scripts\") pod \"manila-efa8-account-create-update-5pr48\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.931958 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvgs\" (UniqueName: 
\"kubernetes.io/projected/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-kube-api-access-6jvgs\") pod \"manila-efa8-account-create-update-5pr48\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:28 crc kubenswrapper[4921]: I0318 13:56:28.980504 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:29 crc kubenswrapper[4921]: I0318 13:56:29.447518 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-frttr"] Mar 18 13:56:29 crc kubenswrapper[4921]: I0318 13:56:29.613006 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-efa8-account-create-update-5pr48"] Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.204853 4921 generic.go:334] "Generic (PLEG): container finished" podID="b0fc5394-7a82-43c6-88fd-ccec001cf9a5" containerID="b4b72efa4d8b79225c60d5cc754b68fee62e122a73f2a61b4fc7544c3352f66e" exitCode=0 Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.205007 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-efa8-account-create-update-5pr48" event={"ID":"b0fc5394-7a82-43c6-88fd-ccec001cf9a5","Type":"ContainerDied","Data":"b4b72efa4d8b79225c60d5cc754b68fee62e122a73f2a61b4fc7544c3352f66e"} Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.205318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-efa8-account-create-update-5pr48" event={"ID":"b0fc5394-7a82-43c6-88fd-ccec001cf9a5","Type":"ContainerStarted","Data":"19b401b88f32ab90ed1445cd05d987ffc250f8805cdbe3f3258471b9127893f9"} Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.207806 4921 generic.go:334] "Generic (PLEG): container finished" podID="69c55e47-621b-45db-8447-543e81a28036" containerID="6d1593444d1d05e1625ed5656a5748ffb87504be92122a82bb356102bae4e77d" exitCode=0 Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.207835 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-frttr" event={"ID":"69c55e47-621b-45db-8447-543e81a28036","Type":"ContainerDied","Data":"6d1593444d1d05e1625ed5656a5748ffb87504be92122a82bb356102bae4e77d"} Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.207851 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-frttr" event={"ID":"69c55e47-621b-45db-8447-543e81a28036","Type":"ContainerStarted","Data":"291ccc0a05ec14856a127063b4014a1d6fdb870c73511f002140ea078afd8899"} Mar 18 13:56:30 crc kubenswrapper[4921]: I0318 13:56:30.210014 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:56:30 crc kubenswrapper[4921]: E0318 13:56:30.210248 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.075309 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-hv9mx"] Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.091382 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-hv9mx"] Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.229438 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438dbf8f-1745-44e0-aadc-b8a6d07598cb" path="/var/lib/kubelet/pods/438dbf8f-1745-44e0-aadc-b8a6d07598cb/volumes" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.780678 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-frttr" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.789431 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.894044 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6m5w\" (UniqueName: \"kubernetes.io/projected/69c55e47-621b-45db-8447-543e81a28036-kube-api-access-q6m5w\") pod \"69c55e47-621b-45db-8447-543e81a28036\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.894129 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jvgs\" (UniqueName: \"kubernetes.io/projected/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-kube-api-access-6jvgs\") pod \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.894170 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c55e47-621b-45db-8447-543e81a28036-operator-scripts\") pod \"69c55e47-621b-45db-8447-543e81a28036\" (UID: \"69c55e47-621b-45db-8447-543e81a28036\") " Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.894256 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-operator-scripts\") pod \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\" (UID: \"b0fc5394-7a82-43c6-88fd-ccec001cf9a5\") " Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.894753 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69c55e47-621b-45db-8447-543e81a28036-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"69c55e47-621b-45db-8447-543e81a28036" (UID: "69c55e47-621b-45db-8447-543e81a28036"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.894859 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0fc5394-7a82-43c6-88fd-ccec001cf9a5" (UID: "b0fc5394-7a82-43c6-88fd-ccec001cf9a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.903337 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c55e47-621b-45db-8447-543e81a28036-kube-api-access-q6m5w" (OuterVolumeSpecName: "kube-api-access-q6m5w") pod "69c55e47-621b-45db-8447-543e81a28036" (UID: "69c55e47-621b-45db-8447-543e81a28036"). InnerVolumeSpecName "kube-api-access-q6m5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.903434 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-kube-api-access-6jvgs" (OuterVolumeSpecName: "kube-api-access-6jvgs") pod "b0fc5394-7a82-43c6-88fd-ccec001cf9a5" (UID: "b0fc5394-7a82-43c6-88fd-ccec001cf9a5"). InnerVolumeSpecName "kube-api-access-6jvgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.996152 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6m5w\" (UniqueName: \"kubernetes.io/projected/69c55e47-621b-45db-8447-543e81a28036-kube-api-access-q6m5w\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.996187 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jvgs\" (UniqueName: \"kubernetes.io/projected/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-kube-api-access-6jvgs\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.996198 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69c55e47-621b-45db-8447-543e81a28036-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:31 crc kubenswrapper[4921]: I0318 13:56:31.996206 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fc5394-7a82-43c6-88fd-ccec001cf9a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:32 crc kubenswrapper[4921]: I0318 13:56:32.235948 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-frttr" event={"ID":"69c55e47-621b-45db-8447-543e81a28036","Type":"ContainerDied","Data":"291ccc0a05ec14856a127063b4014a1d6fdb870c73511f002140ea078afd8899"} Mar 18 13:56:32 crc kubenswrapper[4921]: I0318 13:56:32.235992 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="291ccc0a05ec14856a127063b4014a1d6fdb870c73511f002140ea078afd8899" Mar 18 13:56:32 crc kubenswrapper[4921]: I0318 13:56:32.236006 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-frttr" Mar 18 13:56:32 crc kubenswrapper[4921]: I0318 13:56:32.238199 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-efa8-account-create-update-5pr48" event={"ID":"b0fc5394-7a82-43c6-88fd-ccec001cf9a5","Type":"ContainerDied","Data":"19b401b88f32ab90ed1445cd05d987ffc250f8805cdbe3f3258471b9127893f9"} Mar 18 13:56:32 crc kubenswrapper[4921]: I0318 13:56:32.238978 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b401b88f32ab90ed1445cd05d987ffc250f8805cdbe3f3258471b9127893f9" Mar 18 13:56:32 crc kubenswrapper[4921]: I0318 13:56:32.238314 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-efa8-account-create-update-5pr48" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.016250 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-n7ftq"] Mar 18 13:56:34 crc kubenswrapper[4921]: E0318 13:56:34.017640 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc5394-7a82-43c6-88fd-ccec001cf9a5" containerName="mariadb-account-create-update" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.017661 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc5394-7a82-43c6-88fd-ccec001cf9a5" containerName="mariadb-account-create-update" Mar 18 13:56:34 crc kubenswrapper[4921]: E0318 13:56:34.017687 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c55e47-621b-45db-8447-543e81a28036" containerName="mariadb-database-create" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.017694 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c55e47-621b-45db-8447-543e81a28036" containerName="mariadb-database-create" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.018204 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fc5394-7a82-43c6-88fd-ccec001cf9a5" 
containerName="mariadb-account-create-update" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.018237 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c55e47-621b-45db-8447-543e81a28036" containerName="mariadb-database-create" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.019305 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.024578 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.024844 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-plh59" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.030043 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-n7ftq"] Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.150193 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-combined-ca-bundle\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.150242 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-job-config-data\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.150411 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fccm\" (UniqueName: 
\"kubernetes.io/projected/b0baadf0-64a5-45b5-9e26-491b274ea3d4-kube-api-access-2fccm\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.150458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-config-data\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.252718 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-combined-ca-bundle\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.252789 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-job-config-data\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.252950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fccm\" (UniqueName: \"kubernetes.io/projected/b0baadf0-64a5-45b5-9e26-491b274ea3d4-kube-api-access-2fccm\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.252991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-config-data\") pod 
\"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.259300 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-job-config-data\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.260141 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-config-data\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.260715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-combined-ca-bundle\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.275817 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fccm\" (UniqueName: \"kubernetes.io/projected/b0baadf0-64a5-45b5-9e26-491b274ea3d4-kube-api-access-2fccm\") pod \"manila-db-sync-n7ftq\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:34 crc kubenswrapper[4921]: I0318 13:56:34.351947 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:35 crc kubenswrapper[4921]: W0318 13:56:35.250460 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0baadf0_64a5_45b5_9e26_491b274ea3d4.slice/crio-b6a0578aa08f0e9858b021d4d66620f0fce386c45283cdeab20e9366b7561a6d WatchSource:0}: Error finding container b6a0578aa08f0e9858b021d4d66620f0fce386c45283cdeab20e9366b7561a6d: Status 404 returned error can't find the container with id b6a0578aa08f0e9858b021d4d66620f0fce386c45283cdeab20e9366b7561a6d Mar 18 13:56:35 crc kubenswrapper[4921]: I0318 13:56:35.253829 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-n7ftq"] Mar 18 13:56:35 crc kubenswrapper[4921]: I0318 13:56:35.271578 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-n7ftq" event={"ID":"b0baadf0-64a5-45b5-9e26-491b274ea3d4","Type":"ContainerStarted","Data":"b6a0578aa08f0e9858b021d4d66620f0fce386c45283cdeab20e9366b7561a6d"} Mar 18 13:56:41 crc kubenswrapper[4921]: I0318 13:56:41.363479 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-n7ftq" event={"ID":"b0baadf0-64a5-45b5-9e26-491b274ea3d4","Type":"ContainerStarted","Data":"d89abc3c68ec704d1323a0ab3498afe2d3b8133247a30f47efe56be42b09929a"} Mar 18 13:56:41 crc kubenswrapper[4921]: I0318 13:56:41.395933 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-n7ftq" podStartSLOduration=3.8197196 podStartE2EDuration="8.395913965s" podCreationTimestamp="2026-03-18 13:56:33 +0000 UTC" firstStartedPulling="2026-03-18 13:56:35.252121176 +0000 UTC m=+6414.802041815" lastFinishedPulling="2026-03-18 13:56:39.828315541 +0000 UTC m=+6419.378236180" observedRunningTime="2026-03-18 13:56:41.38037249 +0000 UTC m=+6420.930293149" watchObservedRunningTime="2026-03-18 13:56:41.395913965 +0000 UTC m=+6420.945834594" Mar 18 
13:56:42 crc kubenswrapper[4921]: I0318 13:56:42.209606 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:56:42 crc kubenswrapper[4921]: E0318 13:56:42.209862 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:56:42 crc kubenswrapper[4921]: I0318 13:56:42.373477 4921 generic.go:334] "Generic (PLEG): container finished" podID="b0baadf0-64a5-45b5-9e26-491b274ea3d4" containerID="d89abc3c68ec704d1323a0ab3498afe2d3b8133247a30f47efe56be42b09929a" exitCode=0 Mar 18 13:56:42 crc kubenswrapper[4921]: I0318 13:56:42.373556 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-n7ftq" event={"ID":"b0baadf0-64a5-45b5-9e26-491b274ea3d4","Type":"ContainerDied","Data":"d89abc3c68ec704d1323a0ab3498afe2d3b8133247a30f47efe56be42b09929a"} Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.882179 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.959938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-combined-ca-bundle\") pod \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.960015 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-job-config-data\") pod \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.960064 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fccm\" (UniqueName: \"kubernetes.io/projected/b0baadf0-64a5-45b5-9e26-491b274ea3d4-kube-api-access-2fccm\") pod \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.960240 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-config-data\") pod \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\" (UID: \"b0baadf0-64a5-45b5-9e26-491b274ea3d4\") " Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.965520 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0baadf0-64a5-45b5-9e26-491b274ea3d4-kube-api-access-2fccm" (OuterVolumeSpecName: "kube-api-access-2fccm") pod "b0baadf0-64a5-45b5-9e26-491b274ea3d4" (UID: "b0baadf0-64a5-45b5-9e26-491b274ea3d4"). InnerVolumeSpecName "kube-api-access-2fccm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.967043 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "b0baadf0-64a5-45b5-9e26-491b274ea3d4" (UID: "b0baadf0-64a5-45b5-9e26-491b274ea3d4"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.967814 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-config-data" (OuterVolumeSpecName: "config-data") pod "b0baadf0-64a5-45b5-9e26-491b274ea3d4" (UID: "b0baadf0-64a5-45b5-9e26-491b274ea3d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:43 crc kubenswrapper[4921]: I0318 13:56:43.990218 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0baadf0-64a5-45b5-9e26-491b274ea3d4" (UID: "b0baadf0-64a5-45b5-9e26-491b274ea3d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.062892 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.062926 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.062937 4921 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/b0baadf0-64a5-45b5-9e26-491b274ea3d4-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.062947 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fccm\" (UniqueName: \"kubernetes.io/projected/b0baadf0-64a5-45b5-9e26-491b274ea3d4-kube-api-access-2fccm\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.392500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-n7ftq" event={"ID":"b0baadf0-64a5-45b5-9e26-491b274ea3d4","Type":"ContainerDied","Data":"b6a0578aa08f0e9858b021d4d66620f0fce386c45283cdeab20e9366b7561a6d"} Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.392536 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-n7ftq" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.392574 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6a0578aa08f0e9858b021d4d66620f0fce386c45283cdeab20e9366b7561a6d" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.657361 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 13:56:44 crc kubenswrapper[4921]: E0318 13:56:44.658184 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0baadf0-64a5-45b5-9e26-491b274ea3d4" containerName="manila-db-sync" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.658204 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0baadf0-64a5-45b5-9e26-491b274ea3d4" containerName="manila-db-sync" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.658485 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0baadf0-64a5-45b5-9e26-491b274ea3d4" containerName="manila-db-sync" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.659903 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.666219 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.666442 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-plh59" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.667175 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.667343 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.677248 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.679024 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.682174 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.692486 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.734939 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776469 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776556 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-config-data\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776582 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-scripts\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvflx\" (UniqueName: \"kubernetes.io/projected/806616a6-2ba4-41b2-80ee-369a45cb1447-kube-api-access-dvflx\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776680 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/806616a6-2ba4-41b2-80ee-369a45cb1447-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776762 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-config-data\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776784 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/806616a6-2ba4-41b2-80ee-369a45cb1447-ceph\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776824 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776898 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-scripts\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776935 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.776969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cn69\" (UniqueName: \"kubernetes.io/projected/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-kube-api-access-4cn69\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.777006 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.777068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.777226 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/806616a6-2ba4-41b2-80ee-369a45cb1447-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.836350 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-tvx7d"] Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.840686 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.852901 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-tvx7d"] Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879081 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879183 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/806616a6-2ba4-41b2-80ee-369a45cb1447-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879233 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-scripts\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879310 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-config-data\") pod \"manila-scheduler-0\" (UID: 
\"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879365 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvflx\" (UniqueName: \"kubernetes.io/projected/806616a6-2ba4-41b2-80ee-369a45cb1447-kube-api-access-dvflx\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/806616a6-2ba4-41b2-80ee-369a45cb1447-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879420 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-config-data\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879438 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/806616a6-2ba4-41b2-80ee-369a45cb1447-ceph\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 
13:56:44.879508 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-scripts\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879578 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cn69\" (UniqueName: \"kubernetes.io/projected/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-kube-api-access-4cn69\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.880471 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.883454 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.883588 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/806616a6-2ba4-41b2-80ee-369a45cb1447-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.879373 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/806616a6-2ba4-41b2-80ee-369a45cb1447-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.884691 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.885569 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-config-data\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.885788 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/806616a6-2ba4-41b2-80ee-369a45cb1447-ceph\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " 
pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.885827 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-scripts\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.892171 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.894250 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-config-data\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.898250 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-scripts\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.899096 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cn69\" (UniqueName: \"kubernetes.io/projected/94baf96e-4d22-40a6-a7d9-f04eb6f694b7-kube-api-access-4cn69\") pod \"manila-scheduler-0\" (UID: \"94baf96e-4d22-40a6-a7d9-f04eb6f694b7\") " pod="openstack/manila-scheduler-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.901020 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/806616a6-2ba4-41b2-80ee-369a45cb1447-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.903285 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvflx\" (UniqueName: \"kubernetes.io/projected/806616a6-2ba4-41b2-80ee-369a45cb1447-kube-api-access-dvflx\") pod \"manila-share-share1-0\" (UID: \"806616a6-2ba4-41b2-80ee-369a45cb1447\") " pod="openstack/manila-share-share1-0" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.981687 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-config\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.981747 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.981805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.981913 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-dns-svc\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:44 crc kubenswrapper[4921]: I0318 13:56:44.981943 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7r8\" (UniqueName: \"kubernetes.io/projected/f178f177-b38a-4e0c-b37b-4f00f9f9c853-kube-api-access-hg7r8\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.007013 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.022592 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.088785 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-config\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.088847 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.088896 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-sb\") 
pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.088983 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-dns-svc\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.089012 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7r8\" (UniqueName: \"kubernetes.io/projected/f178f177-b38a-4e0c-b37b-4f00f9f9c853-kube-api-access-hg7r8\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.090528 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.090923 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.091070 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-config\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " 
pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.091445 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-dns-svc\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.118003 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7r8\" (UniqueName: \"kubernetes.io/projected/f178f177-b38a-4e0c-b37b-4f00f9f9c853-kube-api-access-hg7r8\") pod \"dnsmasq-dns-7876bb76fc-tvx7d\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.171450 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.172656 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.173187 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.182356 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.270790 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.295239 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spw52\" (UniqueName: \"kubernetes.io/projected/0485f76e-5cf0-460f-9dd7-ffa89b64455e-kube-api-access-spw52\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.295486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-config-data-custom\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.295580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-config-data\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.314290 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0485f76e-5cf0-460f-9dd7-ffa89b64455e-etc-machine-id\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.314365 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.314587 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-scripts\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.314659 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0485f76e-5cf0-460f-9dd7-ffa89b64455e-logs\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.417745 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.417869 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-scripts\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.417927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0485f76e-5cf0-460f-9dd7-ffa89b64455e-logs\") pod \"manila-api-0\" (UID: 
\"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.417981 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spw52\" (UniqueName: \"kubernetes.io/projected/0485f76e-5cf0-460f-9dd7-ffa89b64455e-kube-api-access-spw52\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.418020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-config-data-custom\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.418053 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-config-data\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.418073 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0485f76e-5cf0-460f-9dd7-ffa89b64455e-etc-machine-id\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.418167 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0485f76e-5cf0-460f-9dd7-ffa89b64455e-etc-machine-id\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.419892 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/0485f76e-5cf0-460f-9dd7-ffa89b64455e-logs\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.430053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-config-data\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.432798 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.433057 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-scripts\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.433748 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0485f76e-5cf0-460f-9dd7-ffa89b64455e-config-data-custom\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.449749 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spw52\" (UniqueName: \"kubernetes.io/projected/0485f76e-5cf0-460f-9dd7-ffa89b64455e-kube-api-access-spw52\") pod \"manila-api-0\" (UID: \"0485f76e-5cf0-460f-9dd7-ffa89b64455e\") " pod="openstack/manila-api-0" Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.544137 
4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Mar 18 13:56:45 crc kubenswrapper[4921]: I0318 13:56:45.827192 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.010458 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-tvx7d"]
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.107360 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 18 13:56:46 crc kubenswrapper[4921]: W0318 13:56:46.115858 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod806616a6_2ba4_41b2_80ee_369a45cb1447.slice/crio-526fed010d9aef3c1eb287a3c835a22f2ba4809b14f7a6495ff7dceea55d4d37 WatchSource:0}: Error finding container 526fed010d9aef3c1eb287a3c835a22f2ba4809b14f7a6495ff7dceea55d4d37: Status 404 returned error can't find the container with id 526fed010d9aef3c1eb287a3c835a22f2ba4809b14f7a6495ff7dceea55d4d37
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.348677 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.436219 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"806616a6-2ba4-41b2-80ee-369a45cb1447","Type":"ContainerStarted","Data":"526fed010d9aef3c1eb287a3c835a22f2ba4809b14f7a6495ff7dceea55d4d37"}
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.437895 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0485f76e-5cf0-460f-9dd7-ffa89b64455e","Type":"ContainerStarted","Data":"984fd3765b48f615c1edc1c41baddb796947a1e4bcce1ab9bfcac902a91d6042"}
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.446874 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"94baf96e-4d22-40a6-a7d9-f04eb6f694b7","Type":"ContainerStarted","Data":"6f4b6f6e047625a3a842c5fa3ea30c57205c654fec7e4aa282f03bc4f8c481bc"}
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.450800 4921 generic.go:334] "Generic (PLEG): container finished" podID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerID="263607671da57cf288f9aa846b77ab3d207f7ec33e6d4e60969389e66e711076" exitCode=0
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.450827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" event={"ID":"f178f177-b38a-4e0c-b37b-4f00f9f9c853","Type":"ContainerDied","Data":"263607671da57cf288f9aa846b77ab3d207f7ec33e6d4e60969389e66e711076"}
Mar 18 13:56:46 crc kubenswrapper[4921]: I0318 13:56:46.450843 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" event={"ID":"f178f177-b38a-4e0c-b37b-4f00f9f9c853","Type":"ContainerStarted","Data":"b6172135f3ac7cac3ae030846a40c98e0a36120f1d3a7003dd09d69b26eece62"}
Mar 18 13:56:47 crc kubenswrapper[4921]: I0318 13:56:47.511198 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0485f76e-5cf0-460f-9dd7-ffa89b64455e","Type":"ContainerStarted","Data":"c2fdfbb956d995677dd0187f13a26e919b0f932bdcd65acdb636eeb92c3a7329"}
Mar 18 13:56:47 crc kubenswrapper[4921]: I0318 13:56:47.513872 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"94baf96e-4d22-40a6-a7d9-f04eb6f694b7","Type":"ContainerStarted","Data":"c1060123d108ef88dee2ca0b6ca16ec2c42ecdc976e7a7ae0a0600913126b7c5"}
Mar 18 13:56:47 crc kubenswrapper[4921]: I0318 13:56:47.517590 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" event={"ID":"f178f177-b38a-4e0c-b37b-4f00f9f9c853","Type":"ContainerStarted","Data":"7bc26aa5afec8bcbf317f52dadfe253f86e8681e072c84fbb01512fbf97b9cd0"}
Mar 18 13:56:47 crc kubenswrapper[4921]: I0318 13:56:47.517695 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d"
Mar 18 13:56:47 crc kubenswrapper[4921]: I0318 13:56:47.535538 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" podStartSLOduration=3.535522666 podStartE2EDuration="3.535522666s" podCreationTimestamp="2026-03-18 13:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:56:47.532311034 +0000 UTC m=+6427.082231673" watchObservedRunningTime="2026-03-18 13:56:47.535522666 +0000 UTC m=+6427.085443305"
Mar 18 13:56:48 crc kubenswrapper[4921]: I0318 13:56:48.533385 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"94baf96e-4d22-40a6-a7d9-f04eb6f694b7","Type":"ContainerStarted","Data":"b973f8e332c8adcf9ad5bf751eace6355c6f8d959a72b95ef019c99125bfc71e"}
Mar 18 13:56:48 crc kubenswrapper[4921]: I0318 13:56:48.538245 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0485f76e-5cf0-460f-9dd7-ffa89b64455e","Type":"ContainerStarted","Data":"2c3e629364a1958e91dbf906bcf52517d8201279ef8e04ba610505d1e33cba17"}
Mar 18 13:56:48 crc kubenswrapper[4921]: I0318 13:56:48.538536 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Mar 18 13:56:48 crc kubenswrapper[4921]: I0318 13:56:48.558619 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.798280628 podStartE2EDuration="4.558596052s" podCreationTimestamp="2026-03-18 13:56:44 +0000 UTC" firstStartedPulling="2026-03-18 13:56:45.848013254 +0000 UTC m=+6425.397933893" lastFinishedPulling="2026-03-18 13:56:46.608328678 +0000 UTC m=+6426.158249317" observedRunningTime="2026-03-18 13:56:48.548563964 +0000 UTC m=+6428.098484603" watchObservedRunningTime="2026-03-18 13:56:48.558596052 +0000 UTC m=+6428.108516681"
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.326637 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.326613072 podStartE2EDuration="5.326613072s" podCreationTimestamp="2026-03-18 13:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:56:48.589967471 +0000 UTC m=+6428.139888110" watchObservedRunningTime="2026-03-18 13:56:50.326613072 +0000 UTC m=+6429.876533711"
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.329222 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.331600 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-central-agent" containerID="cri-o://08840d8766598cc69601fc00b78e28da71c3db9af550624bd98ce5c049bde9da" gracePeriod=30
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.332607 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="proxy-httpd" containerID="cri-o://2940d151e8024eb0a893ccf7761d3fac67bc867c50a3c28e13d83f4284a7d192" gracePeriod=30
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.332671 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="sg-core" containerID="cri-o://6fd78cbe4c63ded946497591c1848e63068677d19e5018c19e326e2eac05f7b2" gracePeriod=30
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.332722 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-notification-agent" containerID="cri-o://9d4f589e5a1cf0b2a5b947d09f7c05231dc5c5834d3a492674a7d6f100a42566" gracePeriod=30
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.342439 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.174:3000/\": EOF"
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.499530 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.174:3000/\": dial tcp 10.217.1.174:3000: connect: connection refused"
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.578669 4921 generic.go:334] "Generic (PLEG): container finished" podID="171a2627-97a7-40a9-b604-1528694f1de0" containerID="2940d151e8024eb0a893ccf7761d3fac67bc867c50a3c28e13d83f4284a7d192" exitCode=0
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.578708 4921 generic.go:334] "Generic (PLEG): container finished" podID="171a2627-97a7-40a9-b604-1528694f1de0" containerID="6fd78cbe4c63ded946497591c1848e63068677d19e5018c19e326e2eac05f7b2" exitCode=2
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.578730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerDied","Data":"2940d151e8024eb0a893ccf7761d3fac67bc867c50a3c28e13d83f4284a7d192"}
Mar 18 13:56:50 crc kubenswrapper[4921]: I0318 13:56:50.578755 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerDied","Data":"6fd78cbe4c63ded946497591c1848e63068677d19e5018c19e326e2eac05f7b2"}
Mar 18 13:56:51 crc kubenswrapper[4921]: I0318 13:56:51.592716 4921 generic.go:334] "Generic (PLEG): container finished" podID="171a2627-97a7-40a9-b604-1528694f1de0" containerID="08840d8766598cc69601fc00b78e28da71c3db9af550624bd98ce5c049bde9da" exitCode=0
Mar 18 13:56:51 crc kubenswrapper[4921]: I0318 13:56:51.593023 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerDied","Data":"08840d8766598cc69601fc00b78e28da71c3db9af550624bd98ce5c049bde9da"}
Mar 18 13:56:52 crc kubenswrapper[4921]: I0318 13:56:52.606823 4921 generic.go:334] "Generic (PLEG): container finished" podID="171a2627-97a7-40a9-b604-1528694f1de0" containerID="9d4f589e5a1cf0b2a5b947d09f7c05231dc5c5834d3a492674a7d6f100a42566" exitCode=0
Mar 18 13:56:52 crc kubenswrapper[4921]: I0318 13:56:52.607126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerDied","Data":"9d4f589e5a1cf0b2a5b947d09f7c05231dc5c5834d3a492674a7d6f100a42566"}
Mar 18 13:56:54 crc kubenswrapper[4921]: I0318 13:56:54.963732 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.007266 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101254 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-config-data\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101314 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-combined-ca-bundle\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101344 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-run-httpd\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101381 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-log-httpd\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101556 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-scripts\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101598 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-sg-core-conf-yaml\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101669 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4xtq\" (UniqueName: \"kubernetes.io/projected/171a2627-97a7-40a9-b604-1528694f1de0-kube-api-access-f4xtq\") pod \"171a2627-97a7-40a9-b604-1528694f1de0\" (UID: \"171a2627-97a7-40a9-b604-1528694f1de0\") "
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.101731 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.102165 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.103960 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.104702 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-scripts" (OuterVolumeSpecName: "scripts") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.109675 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171a2627-97a7-40a9-b604-1528694f1de0-kube-api-access-f4xtq" (OuterVolumeSpecName: "kube-api-access-f4xtq") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "kube-api-access-f4xtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.135599 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.178312 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.207104 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/171a2627-97a7-40a9-b604-1528694f1de0-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.207184 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.207195 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.207209 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4xtq\" (UniqueName: \"kubernetes.io/projected/171a2627-97a7-40a9-b604-1528694f1de0-kube-api-access-f4xtq\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.238295 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.248314 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-config-data" (OuterVolumeSpecName: "config-data") pod "171a2627-97a7-40a9-b604-1528694f1de0" (UID: "171a2627-97a7-40a9-b604-1528694f1de0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.263416 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-6ptd4"]
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.263704 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerName="dnsmasq-dns" containerID="cri-o://a18f118f91380fc2fea0bbb5c3cf426cc0271e62c033562955ecf8180e67d122" gracePeriod=10
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.311839 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.311866 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/171a2627-97a7-40a9-b604-1528694f1de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.651933 4921 generic.go:334] "Generic (PLEG): container finished" podID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerID="a18f118f91380fc2fea0bbb5c3cf426cc0271e62c033562955ecf8180e67d122" exitCode=0
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.652128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" event={"ID":"d95b901e-6bff-439c-b1fb-75ff8939262a","Type":"ContainerDied","Data":"a18f118f91380fc2fea0bbb5c3cf426cc0271e62c033562955ecf8180e67d122"}
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.655373 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"171a2627-97a7-40a9-b604-1528694f1de0","Type":"ContainerDied","Data":"c8dcc370c79ec34f316c200ed812946f7dad105ba6473054f66df3ea31c5ed7d"}
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.655426 4921 scope.go:117] "RemoveContainer" containerID="2940d151e8024eb0a893ccf7761d3fac67bc867c50a3c28e13d83f4284a7d192"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.655605 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.661377 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"806616a6-2ba4-41b2-80ee-369a45cb1447","Type":"ContainerStarted","Data":"f52d5c2ff9efb8886623cd43f62ca2cf73187bbb8c168c4458485e2970e5d8f4"}
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.696306 4921 scope.go:117] "RemoveContainer" containerID="6fd78cbe4c63ded946497591c1848e63068677d19e5018c19e326e2eac05f7b2"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.702658 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.738100 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.759348 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:56:55 crc kubenswrapper[4921]: E0318 13:56:55.759878 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-central-agent"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.759900 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-central-agent"
Mar 18 13:56:55 crc kubenswrapper[4921]: E0318 13:56:55.759926 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-notification-agent"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.759933 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-notification-agent"
Mar 18 13:56:55 crc kubenswrapper[4921]: E0318 13:56:55.759959 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="sg-core"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.759966 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="sg-core"
Mar 18 13:56:55 crc kubenswrapper[4921]: E0318 13:56:55.759978 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="proxy-httpd"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.759984 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="proxy-httpd"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.760210 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-central-agent"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.760231 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="sg-core"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.760242 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="ceilometer-notification-agent"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.760258 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="171a2627-97a7-40a9-b604-1528694f1de0" containerName="proxy-httpd"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.760723 4921 scope.go:117] "RemoveContainer" containerID="9d4f589e5a1cf0b2a5b947d09f7c05231dc5c5834d3a492674a7d6f100a42566"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.762865 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.765367 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.765738 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.771260 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.791756 4921 scope.go:117] "RemoveContainer" containerID="08840d8766598cc69601fc00b78e28da71c3db9af550624bd98ce5c049bde9da"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.925649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.926025 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.926049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-scripts\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.926102 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-run-httpd\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.926156 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkr42\" (UniqueName: \"kubernetes.io/projected/88907204-bd6a-4376-8351-674066a7f122-kube-api-access-lkr42\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.926206 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-config-data\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.926271 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-log-httpd\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:55 crc kubenswrapper[4921]: I0318 13:56:55.969237 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkr42\" (UniqueName: \"kubernetes.io/projected/88907204-bd6a-4376-8351-674066a7f122-kube-api-access-lkr42\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028382 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-config-data\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028429 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-log-httpd\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028472 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028560 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-scripts\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.028581 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-run-httpd\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.029057 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-run-httpd\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.029296 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-log-httpd\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.038064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.038251 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.040526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-config-data\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.043931 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-scripts\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.044725 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkr42\" (UniqueName: \"kubernetes.io/projected/88907204-bd6a-4376-8351-674066a7f122-kube-api-access-lkr42\") pod \"ceilometer-0\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.091239 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.130121 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lffg\" (UniqueName: \"kubernetes.io/projected/d95b901e-6bff-439c-b1fb-75ff8939262a-kube-api-access-8lffg\") pod \"d95b901e-6bff-439c-b1fb-75ff8939262a\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") "
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.130280 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-config\") pod \"d95b901e-6bff-439c-b1fb-75ff8939262a\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") "
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.130331 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-nb\") pod \"d95b901e-6bff-439c-b1fb-75ff8939262a\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") "
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.130453 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-sb\") pod \"d95b901e-6bff-439c-b1fb-75ff8939262a\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") "
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.130496 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-dns-svc\") pod \"d95b901e-6bff-439c-b1fb-75ff8939262a\" (UID: \"d95b901e-6bff-439c-b1fb-75ff8939262a\") "
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.134952 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95b901e-6bff-439c-b1fb-75ff8939262a-kube-api-access-8lffg" (OuterVolumeSpecName: "kube-api-access-8lffg") pod "d95b901e-6bff-439c-b1fb-75ff8939262a" (UID: "d95b901e-6bff-439c-b1fb-75ff8939262a"). InnerVolumeSpecName "kube-api-access-8lffg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.196579 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-config" (OuterVolumeSpecName: "config") pod "d95b901e-6bff-439c-b1fb-75ff8939262a" (UID: "d95b901e-6bff-439c-b1fb-75ff8939262a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.197580 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d95b901e-6bff-439c-b1fb-75ff8939262a" (UID: "d95b901e-6bff-439c-b1fb-75ff8939262a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.197602 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d95b901e-6bff-439c-b1fb-75ff8939262a" (UID: "d95b901e-6bff-439c-b1fb-75ff8939262a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.213665 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.214587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d95b901e-6bff-439c-b1fb-75ff8939262a" (UID: "d95b901e-6bff-439c-b1fb-75ff8939262a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 13:56:56 crc kubenswrapper[4921]: E0318 13:56:56.221636 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.233762 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.233794 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lffg\" (UniqueName: \"kubernetes.io/projected/d95b901e-6bff-439c-b1fb-75ff8939262a-kube-api-access-8lffg\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.233805 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-config\") on node \"crc\" DevicePath \"\""
Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.233816
4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.233824 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d95b901e-6bff-439c-b1fb-75ff8939262a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.591942 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:56 crc kubenswrapper[4921]: W0318 13:56:56.600238 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88907204_bd6a_4376_8351_674066a7f122.slice/crio-400056d75afdb52f42b0bffb15ebbc2a0c1d62ce9e0a8ecb197698b895c8f0d4 WatchSource:0}: Error finding container 400056d75afdb52f42b0bffb15ebbc2a0c1d62ce9e0a8ecb197698b895c8f0d4: Status 404 returned error can't find the container with id 400056d75afdb52f42b0bffb15ebbc2a0c1d62ce9e0a8ecb197698b895c8f0d4 Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.671520 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"806616a6-2ba4-41b2-80ee-369a45cb1447","Type":"ContainerStarted","Data":"0434f5ef800d61f9be236721eaf35a39f265630b2ef94b573555f07b606c8014"} Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.673478 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" event={"ID":"d95b901e-6bff-439c-b1fb-75ff8939262a","Type":"ContainerDied","Data":"c7199ce81394c61aabffac796c7687a9694f01de2597bd53c19d31670f293418"} Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.673500 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7784748f7f-6ptd4" Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.673515 4921 scope.go:117] "RemoveContainer" containerID="a18f118f91380fc2fea0bbb5c3cf426cc0271e62c033562955ecf8180e67d122" Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.676484 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerStarted","Data":"400056d75afdb52f42b0bffb15ebbc2a0c1d62ce9e0a8ecb197698b895c8f0d4"} Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.700165 4921 scope.go:117] "RemoveContainer" containerID="0b99892b1d3e3450c6803a159ccac3e7b8f18dd5d2e90fa9ca36890901e5b987" Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.700444 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.244150968 podStartE2EDuration="12.700427495s" podCreationTimestamp="2026-03-18 13:56:44 +0000 UTC" firstStartedPulling="2026-03-18 13:56:46.129690188 +0000 UTC m=+6425.679610827" lastFinishedPulling="2026-03-18 13:56:54.585966715 +0000 UTC m=+6434.135887354" observedRunningTime="2026-03-18 13:56:56.696259725 +0000 UTC m=+6436.246180364" watchObservedRunningTime="2026-03-18 13:56:56.700427495 +0000 UTC m=+6436.250348134" Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.734203 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-6ptd4"] Mar 18 13:56:56 crc kubenswrapper[4921]: I0318 13:56:56.747253 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7784748f7f-6ptd4"] Mar 18 13:56:57 crc kubenswrapper[4921]: I0318 13:56:57.251338 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171a2627-97a7-40a9-b604-1528694f1de0" path="/var/lib/kubelet/pods/171a2627-97a7-40a9-b604-1528694f1de0/volumes" Mar 18 13:56:57 crc kubenswrapper[4921]: I0318 13:56:57.252532 4921 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" path="/var/lib/kubelet/pods/d95b901e-6bff-439c-b1fb-75ff8939262a/volumes" Mar 18 13:56:57 crc kubenswrapper[4921]: I0318 13:56:57.690519 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerStarted","Data":"e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f"} Mar 18 13:56:58 crc kubenswrapper[4921]: I0318 13:56:58.708685 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerStarted","Data":"1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba"} Mar 18 13:56:59 crc kubenswrapper[4921]: I0318 13:56:59.383734 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:56:59 crc kubenswrapper[4921]: I0318 13:56:59.728705 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerStarted","Data":"a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4"} Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.749619 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerStarted","Data":"a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f"} Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.750362 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.749905 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="proxy-httpd" 
containerID="cri-o://a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f" gracePeriod=30 Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.749755 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-central-agent" containerID="cri-o://e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f" gracePeriod=30 Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.749842 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="sg-core" containerID="cri-o://a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4" gracePeriod=30 Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.749914 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-notification-agent" containerID="cri-o://1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba" gracePeriod=30 Mar 18 13:57:01 crc kubenswrapper[4921]: I0318 13:57:01.781696 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.326879722 podStartE2EDuration="6.781671548s" podCreationTimestamp="2026-03-18 13:56:55 +0000 UTC" firstStartedPulling="2026-03-18 13:56:56.605343199 +0000 UTC m=+6436.155263838" lastFinishedPulling="2026-03-18 13:57:01.060135025 +0000 UTC m=+6440.610055664" observedRunningTime="2026-03-18 13:57:01.772802623 +0000 UTC m=+6441.322723282" watchObservedRunningTime="2026-03-18 13:57:01.781671548 +0000 UTC m=+6441.331592187" Mar 18 13:57:02 crc kubenswrapper[4921]: E0318 13:57:02.159100 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88907204_bd6a_4376_8351_674066a7f122.slice/crio-conmon-a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:57:02 crc kubenswrapper[4921]: I0318 13:57:02.760775 4921 generic.go:334] "Generic (PLEG): container finished" podID="88907204-bd6a-4376-8351-674066a7f122" containerID="a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f" exitCode=0 Mar 18 13:57:02 crc kubenswrapper[4921]: I0318 13:57:02.761064 4921 generic.go:334] "Generic (PLEG): container finished" podID="88907204-bd6a-4376-8351-674066a7f122" containerID="a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4" exitCode=2 Mar 18 13:57:02 crc kubenswrapper[4921]: I0318 13:57:02.761076 4921 generic.go:334] "Generic (PLEG): container finished" podID="88907204-bd6a-4376-8351-674066a7f122" containerID="1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba" exitCode=0 Mar 18 13:57:02 crc kubenswrapper[4921]: I0318 13:57:02.761100 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerDied","Data":"a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f"} Mar 18 13:57:02 crc kubenswrapper[4921]: I0318 13:57:02.761153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerDied","Data":"a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4"} Mar 18 13:57:02 crc kubenswrapper[4921]: I0318 13:57:02.761166 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerDied","Data":"1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba"} Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.683341 4921 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.742619 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-sg-core-conf-yaml\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.742692 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkr42\" (UniqueName: \"kubernetes.io/projected/88907204-bd6a-4376-8351-674066a7f122-kube-api-access-lkr42\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.742736 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-run-httpd\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.742854 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-log-httpd\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.742932 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-config-data\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.742979 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-combined-ca-bundle\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.743033 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-scripts\") pod \"88907204-bd6a-4376-8351-674066a7f122\" (UID: \"88907204-bd6a-4376-8351-674066a7f122\") " Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.743688 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.744617 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.744813 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.749522 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-scripts" (OuterVolumeSpecName: "scripts") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.751978 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88907204-bd6a-4376-8351-674066a7f122-kube-api-access-lkr42" (OuterVolumeSpecName: "kube-api-access-lkr42") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). InnerVolumeSpecName "kube-api-access-lkr42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.773764 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.784651 4921 generic.go:334] "Generic (PLEG): container finished" podID="88907204-bd6a-4376-8351-674066a7f122" containerID="e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f" exitCode=0 Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.784704 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.784720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerDied","Data":"e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f"} Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.785043 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88907204-bd6a-4376-8351-674066a7f122","Type":"ContainerDied","Data":"400056d75afdb52f42b0bffb15ebbc2a0c1d62ce9e0a8ecb197698b895c8f0d4"} Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.785070 4921 scope.go:117] "RemoveContainer" containerID="a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.836454 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.846275 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.846931 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkr42\" (UniqueName: \"kubernetes.io/projected/88907204-bd6a-4376-8351-674066a7f122-kube-api-access-lkr42\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.847005 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88907204-bd6a-4376-8351-674066a7f122-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.847070 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.847148 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.863127 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-config-data" (OuterVolumeSpecName: "config-data") pod "88907204-bd6a-4376-8351-674066a7f122" (UID: "88907204-bd6a-4376-8351-674066a7f122"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.949925 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88907204-bd6a-4376-8351-674066a7f122-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.955543 4921 scope.go:117] "RemoveContainer" containerID="a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.972497 4921 scope.go:117] "RemoveContainer" containerID="1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba" Mar 18 13:57:04 crc kubenswrapper[4921]: I0318 13:57:04.991842 4921 scope.go:117] "RemoveContainer" containerID="e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.022829 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.032271 4921 scope.go:117] "RemoveContainer" containerID="a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.032793 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f\": container with ID starting with a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f not found: ID does not exist" containerID="a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.032825 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f"} err="failed to get container status 
\"a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f\": rpc error: code = NotFound desc = could not find container \"a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f\": container with ID starting with a413ed8ddf8d28b21782fc483c198cf4694adaa697e21613369980ccdcff221f not found: ID does not exist" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.032849 4921 scope.go:117] "RemoveContainer" containerID="a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.033219 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4\": container with ID starting with a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4 not found: ID does not exist" containerID="a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.033244 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4"} err="failed to get container status \"a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4\": rpc error: code = NotFound desc = could not find container \"a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4\": container with ID starting with a733f4b7ae027a04d8ef6885f14918a99bb8452e3abee27d4ef50b5324c930a4 not found: ID does not exist" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.033258 4921 scope.go:117] "RemoveContainer" containerID="1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.033523 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba\": container with ID starting with 1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba not found: ID does not exist" containerID="1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.033554 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba"} err="failed to get container status \"1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba\": rpc error: code = NotFound desc = could not find container \"1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba\": container with ID starting with 1999d586405084a50ab47ff7428548244f275c41ef620f877b71aa5fdb7a13ba not found: ID does not exist" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.033568 4921 scope.go:117] "RemoveContainer" containerID="e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.033917 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f\": container with ID starting with e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f not found: ID does not exist" containerID="e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.033938 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f"} err="failed to get container status \"e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f\": rpc error: code = NotFound desc = could not find container \"e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f\": container with ID 
starting with e7bbab0197500d79144dd0f7a2f86c617402516c4e28da6c3c14085c8ac4123f not found: ID does not exist" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.149667 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.182582 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.235981 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88907204-bd6a-4376-8351-674066a7f122" path="/var/lib/kubelet/pods/88907204-bd6a-4376-8351-674066a7f122/volumes" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.237615 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.238008 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-central-agent" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.238023 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-central-agent" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.238045 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-notification-agent" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.238053 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-notification-agent" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.238077 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerName="init" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.238085 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" 
containerName="init" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.238101 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerName="dnsmasq-dns" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.238909 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerName="dnsmasq-dns" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.238994 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="proxy-httpd" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239012 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="proxy-httpd" Mar 18 13:57:05 crc kubenswrapper[4921]: E0318 13:57:05.239055 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="sg-core" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239065 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="sg-core" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239422 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="proxy-httpd" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239475 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-central-agent" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239500 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="sg-core" Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239517 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95b901e-6bff-439c-b1fb-75ff8939262a" containerName="dnsmasq-dns" Mar 18 
13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.239556 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88907204-bd6a-4376-8351-674066a7f122" containerName="ceilometer-notification-agent"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.241860 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.241950 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.245404 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.245552 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.359230 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-config-data\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.359347 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c29011d7-f9b6-4594-9817-af2780632e82-log-httpd\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.359774 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.359870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c29011d7-f9b6-4594-9817-af2780632e82-run-httpd\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.359934 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.359977 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wfkj\" (UniqueName: \"kubernetes.io/projected/c29011d7-f9b6-4594-9817-af2780632e82-kube-api-access-5wfkj\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.360097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-scripts\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.461897 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c29011d7-f9b6-4594-9817-af2780632e82-log-httpd\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.462084 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.462145 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c29011d7-f9b6-4594-9817-af2780632e82-run-httpd\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.462183 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.462219 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wfkj\" (UniqueName: \"kubernetes.io/projected/c29011d7-f9b6-4594-9817-af2780632e82-kube-api-access-5wfkj\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.462286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-scripts\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.462318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-config-data\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.463186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c29011d7-f9b6-4594-9817-af2780632e82-run-httpd\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.464982 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c29011d7-f9b6-4594-9817-af2780632e82-log-httpd\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.467329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.467712 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.467827 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-config-data\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.472748 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29011d7-f9b6-4594-9817-af2780632e82-scripts\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.488140 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wfkj\" (UniqueName: \"kubernetes.io/projected/c29011d7-f9b6-4594-9817-af2780632e82-kube-api-access-5wfkj\") pod \"ceilometer-0\" (UID: \"c29011d7-f9b6-4594-9817-af2780632e82\") " pod="openstack/ceilometer-0"
Mar 18 13:57:05 crc kubenswrapper[4921]: I0318 13:57:05.570558 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 13:57:06 crc kubenswrapper[4921]: I0318 13:57:06.117933 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 13:57:06 crc kubenswrapper[4921]: W0318 13:57:06.121431 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc29011d7_f9b6_4594_9817_af2780632e82.slice/crio-f035894e96c6d79fcf9807f1a389f1c95394b38a317550b8dbc18f2e9d3cfeeb WatchSource:0}: Error finding container f035894e96c6d79fcf9807f1a389f1c95394b38a317550b8dbc18f2e9d3cfeeb: Status 404 returned error can't find the container with id f035894e96c6d79fcf9807f1a389f1c95394b38a317550b8dbc18f2e9d3cfeeb
Mar 18 13:57:06 crc kubenswrapper[4921]: I0318 13:57:06.722781 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Mar 18 13:57:06 crc kubenswrapper[4921]: I0318 13:57:06.796293 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Mar 18 13:57:06 crc kubenswrapper[4921]: I0318 13:57:06.823563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c29011d7-f9b6-4594-9817-af2780632e82","Type":"ContainerStarted","Data":"f035894e96c6d79fcf9807f1a389f1c95394b38a317550b8dbc18f2e9d3cfeeb"}
Mar 18 13:57:07 crc kubenswrapper[4921]: I0318 13:57:07.155039 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Mar 18 13:57:07 crc kubenswrapper[4921]: I0318 13:57:07.834241 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c29011d7-f9b6-4594-9817-af2780632e82","Type":"ContainerStarted","Data":"e67c500ddceacdf4fa6ff817070be589b0a4bbdefede20f5a5fc897ce0f97de1"}
Mar 18 13:57:07 crc kubenswrapper[4921]: I0318 13:57:07.834585 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c29011d7-f9b6-4594-9817-af2780632e82","Type":"ContainerStarted","Data":"06f42158081aa89707107e3b668da5d3d58781e80c33d5c6add3de671f03669b"}
Mar 18 13:57:08 crc kubenswrapper[4921]: I0318 13:57:08.209841 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 13:57:08 crc kubenswrapper[4921]: E0318 13:57:08.210182 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:57:08 crc kubenswrapper[4921]: I0318 13:57:08.859695 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c29011d7-f9b6-4594-9817-af2780632e82","Type":"ContainerStarted","Data":"9841bdd888e4eecefce7028710e83c7c8bb9ddc1b98878ea205e50dde3cf1658"}
Mar 18 13:57:11 crc kubenswrapper[4921]: I0318 13:57:11.893568 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c29011d7-f9b6-4594-9817-af2780632e82","Type":"ContainerStarted","Data":"48d99719cc8333a9b54fa22addfab77c5b26afca9a9a2d509bd708cdd4c20b1e"}
Mar 18 13:57:11 crc kubenswrapper[4921]: I0318 13:57:11.895228 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 13:57:11 crc kubenswrapper[4921]: I0318 13:57:11.922148 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.906444158 podStartE2EDuration="6.922097541s" podCreationTimestamp="2026-03-18 13:57:05 +0000 UTC" firstStartedPulling="2026-03-18 13:57:06.128046365 +0000 UTC m=+6445.677967004" lastFinishedPulling="2026-03-18 13:57:11.143699738 +0000 UTC m=+6450.693620387" observedRunningTime="2026-03-18 13:57:11.914230965 +0000 UTC m=+6451.464151604" watchObservedRunningTime="2026-03-18 13:57:11.922097541 +0000 UTC m=+6451.472018180"
Mar 18 13:57:13 crc kubenswrapper[4921]: I0318 13:57:13.547167 4921 scope.go:117] "RemoveContainer" containerID="59247b393de7d8e73cf98fe0ed7ac2c7d42cfed7291ccdc8306ce7991128b10e"
Mar 18 13:57:13 crc kubenswrapper[4921]: I0318 13:57:13.621444 4921 scope.go:117] "RemoveContainer" containerID="3a84fa8eb23e5cf70baeb5004445d14940f01c62f37ac80017e0d26cb977986f"
Mar 18 13:57:13 crc kubenswrapper[4921]: I0318 13:57:13.654889 4921 scope.go:117] "RemoveContainer" containerID="44bc585a50fed56ab23a48d188c2f506f1b04b37e336167779e738e2f343c302"
Mar 18 13:57:13 crc kubenswrapper[4921]: I0318 13:57:13.708706 4921 scope.go:117] "RemoveContainer" containerID="4556ba35566cd16bc3e9089cfce9d705a6cdbc67a789dcc88db9287d062c897b"
Mar 18 13:57:13 crc kubenswrapper[4921]: I0318 13:57:13.753722 4921 scope.go:117] "RemoveContainer" containerID="956b715c3d9db81be57c6842dd8e461a4552c74466ed489c5827da1000fee6b6"
Mar 18 13:57:19 crc kubenswrapper[4921]: I0318 13:57:19.209374 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 13:57:19 crc kubenswrapper[4921]: E0318 13:57:19.210154 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:57:33 crc kubenswrapper[4921]: I0318 13:57:33.208961 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 13:57:33 crc kubenswrapper[4921]: E0318 13:57:33.209851 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:57:35 crc kubenswrapper[4921]: I0318 13:57:35.576367 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 18 13:57:46 crc kubenswrapper[4921]: I0318 13:57:46.209532 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 13:57:46 crc kubenswrapper[4921]: E0318 13:57:46.210540 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.782392 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-kt85k"]
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.785528 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.787795 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.796356 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-kt85k"]
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.918316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-openstack-cell1\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.918430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-config\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.918457 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-dns-svc\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.918515 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-647x2\" (UniqueName: \"kubernetes.io/projected/a5542fdb-3025-48eb-940b-0bdcd810301f-kube-api-access-647x2\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.918555 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-sb\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:50 crc kubenswrapper[4921]: I0318 13:57:50.918610 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-nb\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.020423 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-openstack-cell1\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.020515 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-config\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.020533 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-dns-svc\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.020553 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-647x2\" (UniqueName: \"kubernetes.io/projected/a5542fdb-3025-48eb-940b-0bdcd810301f-kube-api-access-647x2\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.020579 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-sb\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.020622 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-nb\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.021785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-nb\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.021817 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-openstack-cell1\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.021965 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-dns-svc\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.022098 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-config\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.022266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-sb\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.039632 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-647x2\" (UniqueName: \"kubernetes.io/projected/a5542fdb-3025-48eb-940b-0bdcd810301f-kube-api-access-647x2\") pod \"dnsmasq-dns-65f77b9c99-kt85k\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.110554 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:51 crc kubenswrapper[4921]: I0318 13:57:51.586651 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-kt85k"]
Mar 18 13:57:52 crc kubenswrapper[4921]: I0318 13:57:52.297393 4921 generic.go:334] "Generic (PLEG): container finished" podID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerID="c4504c431c451e0501f2c7bd5ab9b61d894c97af5fb35c297d45c6cfd8544b94" exitCode=0
Mar 18 13:57:52 crc kubenswrapper[4921]: I0318 13:57:52.297450 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" event={"ID":"a5542fdb-3025-48eb-940b-0bdcd810301f","Type":"ContainerDied","Data":"c4504c431c451e0501f2c7bd5ab9b61d894c97af5fb35c297d45c6cfd8544b94"}
Mar 18 13:57:52 crc kubenswrapper[4921]: I0318 13:57:52.297947 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" event={"ID":"a5542fdb-3025-48eb-940b-0bdcd810301f","Type":"ContainerStarted","Data":"64ed38410c230d20038ee2aac8b818a1c1e1acd9e01de8e2391eaff680ff64f5"}
Mar 18 13:57:53 crc kubenswrapper[4921]: I0318 13:57:53.307462 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" event={"ID":"a5542fdb-3025-48eb-940b-0bdcd810301f","Type":"ContainerStarted","Data":"64199adac35b8eb071bb595db418f94dd5a0b077b479e06d2fe73ec5f008935a"}
Mar 18 13:57:53 crc kubenswrapper[4921]: I0318 13:57:53.307911 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:57:53 crc kubenswrapper[4921]: I0318 13:57:53.331512 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" podStartSLOduration=3.331494528 podStartE2EDuration="3.331494528s" podCreationTimestamp="2026-03-18 13:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:57:53.326988299 +0000 UTC m=+6492.876908948" watchObservedRunningTime="2026-03-18 13:57:53.331494528 +0000 UTC m=+6492.881415167"
Mar 18 13:57:58 crc kubenswrapper[4921]: I0318 13:57:58.209367 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 13:57:58 crc kubenswrapper[4921]: E0318 13:57:58.210319 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.153320 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564038-xzctk"]
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.155716 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-xzctk"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.161579 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.161581 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.162281 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.167909 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-xzctk"]
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.222552 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stk8m\" (UniqueName: \"kubernetes.io/projected/e46ce871-cef4-40f1-bccb-5beb9b745e00-kube-api-access-stk8m\") pod \"auto-csr-approver-29564038-xzctk\" (UID: \"e46ce871-cef4-40f1-bccb-5beb9b745e00\") " pod="openshift-infra/auto-csr-approver-29564038-xzctk"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.324564 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stk8m\" (UniqueName: \"kubernetes.io/projected/e46ce871-cef4-40f1-bccb-5beb9b745e00-kube-api-access-stk8m\") pod \"auto-csr-approver-29564038-xzctk\" (UID: \"e46ce871-cef4-40f1-bccb-5beb9b745e00\") " pod="openshift-infra/auto-csr-approver-29564038-xzctk"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.344845 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stk8m\" (UniqueName: \"kubernetes.io/projected/e46ce871-cef4-40f1-bccb-5beb9b745e00-kube-api-access-stk8m\") pod \"auto-csr-approver-29564038-xzctk\" (UID: \"e46ce871-cef4-40f1-bccb-5beb9b745e00\") " pod="openshift-infra/auto-csr-approver-29564038-xzctk"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.481465 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-xzctk"
Mar 18 13:58:00 crc kubenswrapper[4921]: I0318 13:58:00.986625 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-xzctk"]
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.112620 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.197804 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-tvx7d"]
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.199077 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerName="dnsmasq-dns" containerID="cri-o://7bc26aa5afec8bcbf317f52dadfe253f86e8681e072c84fbb01512fbf97b9cd0" gracePeriod=10
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.375357 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df8f9c6bc-rn6dv"]
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.377867 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.428815 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df8f9c6bc-rn6dv"]
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.466536 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-openstack-cell1\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.466650 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-ovsdbserver-sb\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.466680 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-config\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.466715 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-ovsdbserver-nb\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.466845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnb5s\" (UniqueName: \"kubernetes.io/projected/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-kube-api-access-wnb5s\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.466981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-dns-svc\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.472273 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-xzctk" event={"ID":"e46ce871-cef4-40f1-bccb-5beb9b745e00","Type":"ContainerStarted","Data":"5bd7ce0bca2994863e383de5fb69889f8eaea8f2a5cbef716fe674e7b1bdae19"}
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.475381 4921 generic.go:334] "Generic (PLEG): container finished" podID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerID="7bc26aa5afec8bcbf317f52dadfe253f86e8681e072c84fbb01512fbf97b9cd0" exitCode=0
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.475423 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" event={"ID":"f178f177-b38a-4e0c-b37b-4f00f9f9c853","Type":"ContainerDied","Data":"7bc26aa5afec8bcbf317f52dadfe253f86e8681e072c84fbb01512fbf97b9cd0"}
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.575323 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-dns-svc\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.575490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-openstack-cell1\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.575537 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-ovsdbserver-sb\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.575569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-config\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.575606 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-ovsdbserver-nb\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.575682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnb5s\" (UniqueName: \"kubernetes.io/projected/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-kube-api-access-wnb5s\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.576933 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-config\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.577516 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-ovsdbserver-nb\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.577945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-ovsdbserver-sb\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.578061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-openstack-cell1\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.578283 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-dns-svc\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.598750 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnb5s\" (UniqueName: \"kubernetes.io/projected/ace2674f-ecdd-4b84-850e-f7f40bb91fbc-kube-api-access-wnb5s\") pod \"dnsmasq-dns-df8f9c6bc-rn6dv\" (UID: \"ace2674f-ecdd-4b84-850e-f7f40bb91fbc\") " pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.734307 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.865566 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d"
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.982388 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-sb\") pod \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") "
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.983172 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg7r8\" (UniqueName: \"kubernetes.io/projected/f178f177-b38a-4e0c-b37b-4f00f9f9c853-kube-api-access-hg7r8\") pod \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") "
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.983208 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-nb\") pod \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") "
Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.983415 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-dns-svc\") pod \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\" (UID: 
\"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.983488 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-config\") pod \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\" (UID: \"f178f177-b38a-4e0c-b37b-4f00f9f9c853\") " Mar 18 13:58:01 crc kubenswrapper[4921]: I0318 13:58:01.988223 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f178f177-b38a-4e0c-b37b-4f00f9f9c853-kube-api-access-hg7r8" (OuterVolumeSpecName: "kube-api-access-hg7r8") pod "f178f177-b38a-4e0c-b37b-4f00f9f9c853" (UID: "f178f177-b38a-4e0c-b37b-4f00f9f9c853"). InnerVolumeSpecName "kube-api-access-hg7r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.040550 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-config" (OuterVolumeSpecName: "config") pod "f178f177-b38a-4e0c-b37b-4f00f9f9c853" (UID: "f178f177-b38a-4e0c-b37b-4f00f9f9c853"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.047912 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f178f177-b38a-4e0c-b37b-4f00f9f9c853" (UID: "f178f177-b38a-4e0c-b37b-4f00f9f9c853"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.050820 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f178f177-b38a-4e0c-b37b-4f00f9f9c853" (UID: "f178f177-b38a-4e0c-b37b-4f00f9f9c853"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.056324 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f178f177-b38a-4e0c-b37b-4f00f9f9c853" (UID: "f178f177-b38a-4e0c-b37b-4f00f9f9c853"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.085859 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg7r8\" (UniqueName: \"kubernetes.io/projected/f178f177-b38a-4e0c-b37b-4f00f9f9c853-kube-api-access-hg7r8\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.086174 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.086186 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.086195 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-config\") on node \"crc\" DevicePath \"\"" Mar 
18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.086203 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f178f177-b38a-4e0c-b37b-4f00f9f9c853-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.246711 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df8f9c6bc-rn6dv"] Mar 18 13:58:02 crc kubenswrapper[4921]: W0318 13:58:02.251986 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace2674f_ecdd_4b84_850e_f7f40bb91fbc.slice/crio-4f753b3a66da242f2cfd8ea4c7a1b2b23571c142e6a1dc3d86339ee8530e24e8 WatchSource:0}: Error finding container 4f753b3a66da242f2cfd8ea4c7a1b2b23571c142e6a1dc3d86339ee8530e24e8: Status 404 returned error can't find the container with id 4f753b3a66da242f2cfd8ea4c7a1b2b23571c142e6a1dc3d86339ee8530e24e8 Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.486952 4921 generic.go:334] "Generic (PLEG): container finished" podID="ace2674f-ecdd-4b84-850e-f7f40bb91fbc" containerID="6ac65450340256b4f0ac5afe9d6f0a33be29f71f48f748b37d3086d3069d657d" exitCode=0 Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.487017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv" event={"ID":"ace2674f-ecdd-4b84-850e-f7f40bb91fbc","Type":"ContainerDied","Data":"6ac65450340256b4f0ac5afe9d6f0a33be29f71f48f748b37d3086d3069d657d"} Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.487042 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv" event={"ID":"ace2674f-ecdd-4b84-850e-f7f40bb91fbc","Type":"ContainerStarted","Data":"4f753b3a66da242f2cfd8ea4c7a1b2b23571c142e6a1dc3d86339ee8530e24e8"} Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.493965 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" event={"ID":"f178f177-b38a-4e0c-b37b-4f00f9f9c853","Type":"ContainerDied","Data":"b6172135f3ac7cac3ae030846a40c98e0a36120f1d3a7003dd09d69b26eece62"} Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.494019 4921 scope.go:117] "RemoveContainer" containerID="7bc26aa5afec8bcbf317f52dadfe253f86e8681e072c84fbb01512fbf97b9cd0" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.494028 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bb76fc-tvx7d" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.551750 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-tvx7d"] Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.552899 4921 scope.go:117] "RemoveContainer" containerID="263607671da57cf288f9aa846b77ab3d207f7ec33e6d4e60969389e66e711076" Mar 18 13:58:02 crc kubenswrapper[4921]: I0318 13:58:02.562489 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7876bb76fc-tvx7d"] Mar 18 13:58:03 crc kubenswrapper[4921]: I0318 13:58:03.226685 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" path="/var/lib/kubelet/pods/f178f177-b38a-4e0c-b37b-4f00f9f9c853/volumes" Mar 18 13:58:03 crc kubenswrapper[4921]: I0318 13:58:03.509383 4921 generic.go:334] "Generic (PLEG): container finished" podID="e46ce871-cef4-40f1-bccb-5beb9b745e00" containerID="6cb3f8aecc8e73cb106559ae90d47b6841266defba6d3964b6533e1d64deccde" exitCode=0 Mar 18 13:58:03 crc kubenswrapper[4921]: I0318 13:58:03.509437 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-xzctk" event={"ID":"e46ce871-cef4-40f1-bccb-5beb9b745e00","Type":"ContainerDied","Data":"6cb3f8aecc8e73cb106559ae90d47b6841266defba6d3964b6533e1d64deccde"} Mar 18 13:58:03 crc kubenswrapper[4921]: I0318 13:58:03.511764 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv" event={"ID":"ace2674f-ecdd-4b84-850e-f7f40bb91fbc","Type":"ContainerStarted","Data":"bf71e0d6f5922e36b89a47909c517a7646269b1ba17a3bbe342483cc8da6a216"} Mar 18 13:58:03 crc kubenswrapper[4921]: I0318 13:58:03.511928 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv" Mar 18 13:58:03 crc kubenswrapper[4921]: I0318 13:58:03.550862 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv" podStartSLOduration=2.550842362 podStartE2EDuration="2.550842362s" podCreationTimestamp="2026-03-18 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:58:03.544269134 +0000 UTC m=+6503.094189793" watchObservedRunningTime="2026-03-18 13:58:03.550842362 +0000 UTC m=+6503.100763001" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.028039 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-xzctk" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.151955 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stk8m\" (UniqueName: \"kubernetes.io/projected/e46ce871-cef4-40f1-bccb-5beb9b745e00-kube-api-access-stk8m\") pod \"e46ce871-cef4-40f1-bccb-5beb9b745e00\" (UID: \"e46ce871-cef4-40f1-bccb-5beb9b745e00\") " Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.173535 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bm58q"] Mar 18 13:58:05 crc kubenswrapper[4921]: E0318 13:58:05.174193 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerName="init" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.174288 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerName="init" Mar 18 13:58:05 crc kubenswrapper[4921]: E0318 13:58:05.174397 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerName="dnsmasq-dns" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.174472 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerName="dnsmasq-dns" Mar 18 13:58:05 crc kubenswrapper[4921]: E0318 13:58:05.174535 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46ce871-cef4-40f1-bccb-5beb9b745e00" containerName="oc" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.174615 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46ce871-cef4-40f1-bccb-5beb9b745e00" containerName="oc" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.174922 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46ce871-cef4-40f1-bccb-5beb9b745e00" containerName="oc" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 
13:58:05.174996 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f178f177-b38a-4e0c-b37b-4f00f9f9c853" containerName="dnsmasq-dns" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.176853 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.176840 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46ce871-cef4-40f1-bccb-5beb9b745e00-kube-api-access-stk8m" (OuterVolumeSpecName: "kube-api-access-stk8m") pod "e46ce871-cef4-40f1-bccb-5beb9b745e00" (UID: "e46ce871-cef4-40f1-bccb-5beb9b745e00"). InnerVolumeSpecName "kube-api-access-stk8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.183479 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm58q"] Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.254099 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-utilities\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.254451 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rj89\" (UniqueName: \"kubernetes.io/projected/db06755a-7e60-4279-bb32-2fb921e21529-kube-api-access-4rj89\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.254518 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-catalog-content\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.254723 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stk8m\" (UniqueName: \"kubernetes.io/projected/e46ce871-cef4-40f1-bccb-5beb9b745e00-kube-api-access-stk8m\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.356217 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rj89\" (UniqueName: \"kubernetes.io/projected/db06755a-7e60-4279-bb32-2fb921e21529-kube-api-access-4rj89\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.356281 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-catalog-content\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.356413 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-utilities\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.357025 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-utilities\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " 
pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.357040 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-catalog-content\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.379948 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rj89\" (UniqueName: \"kubernetes.io/projected/db06755a-7e60-4279-bb32-2fb921e21529-kube-api-access-4rj89\") pod \"redhat-operators-bm58q\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.530478 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.565337 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564038-xzctk" event={"ID":"e46ce871-cef4-40f1-bccb-5beb9b745e00","Type":"ContainerDied","Data":"5bd7ce0bca2994863e383de5fb69889f8eaea8f2a5cbef716fe674e7b1bdae19"} Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.565376 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bd7ce0bca2994863e383de5fb69889f8eaea8f2a5cbef716fe674e7b1bdae19" Mar 18 13:58:05 crc kubenswrapper[4921]: I0318 13:58:05.565432 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564038-xzctk" Mar 18 13:58:06 crc kubenswrapper[4921]: I0318 13:58:06.012672 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm58q"] Mar 18 13:58:06 crc kubenswrapper[4921]: I0318 13:58:06.124546 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-q9trx"] Mar 18 13:58:06 crc kubenswrapper[4921]: I0318 13:58:06.134085 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564032-q9trx"] Mar 18 13:58:06 crc kubenswrapper[4921]: I0318 13:58:06.577546 4921 generic.go:334] "Generic (PLEG): container finished" podID="db06755a-7e60-4279-bb32-2fb921e21529" containerID="756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2" exitCode=0 Mar 18 13:58:06 crc kubenswrapper[4921]: I0318 13:58:06.577594 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerDied","Data":"756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2"} Mar 18 13:58:06 crc kubenswrapper[4921]: I0318 13:58:06.577633 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerStarted","Data":"617e907f4f323cbd9b15cdcce419bbba4205b8858973a318949f9d671f084399"} Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.225839 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5341abdc-0c56-42ee-9f32-5058baeaedcc" path="/var/lib/kubelet/pods/5341abdc-0c56-42ee-9f32-5058baeaedcc/volumes" Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.597332 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" 
event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerStarted","Data":"f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc"} Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.949154 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672"] Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.951103 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.954688 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.954768 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.954841 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.955526 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:58:07 crc kubenswrapper[4921]: I0318 13:58:07.967070 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672"] Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.012035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 
13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.012202 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.012248 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.012306 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt8gd\" (UniqueName: \"kubernetes.io/projected/717f9e13-67d4-4a9b-a104-b906a75094bf-kube-api-access-qt8gd\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.012550 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.115716 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.115888 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.115931 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.116014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt8gd\" (UniqueName: \"kubernetes.io/projected/717f9e13-67d4-4a9b-a104-b906a75094bf-kube-api-access-qt8gd\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.116082 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.122417 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.122449 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.125488 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.135619 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" 
(UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.136536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt8gd\" (UniqueName: \"kubernetes.io/projected/717f9e13-67d4-4a9b-a104-b906a75094bf-kube-api-access-qt8gd\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cjt672\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:08 crc kubenswrapper[4921]: I0318 13:58:08.271405 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:09 crc kubenswrapper[4921]: I0318 13:58:09.190840 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672"] Mar 18 13:58:09 crc kubenswrapper[4921]: I0318 13:58:09.621055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" event={"ID":"717f9e13-67d4-4a9b-a104-b906a75094bf","Type":"ContainerStarted","Data":"151949f73f9df9976f97d2d61f5d76f40b79be51651b64d65f8a157532a2cac8"} Mar 18 13:58:10 crc kubenswrapper[4921]: I0318 13:58:10.209602 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:58:10 crc kubenswrapper[4921]: E0318 13:58:10.210224 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:58:11 crc kubenswrapper[4921]: I0318 13:58:11.735800 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-df8f9c6bc-rn6dv" Mar 18 13:58:11 crc kubenswrapper[4921]: I0318 13:58:11.835775 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-kt85k"] Mar 18 13:58:11 crc kubenswrapper[4921]: I0318 13:58:11.836420 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerName="dnsmasq-dns" containerID="cri-o://64199adac35b8eb071bb595db418f94dd5a0b077b479e06d2fe73ec5f008935a" gracePeriod=10 Mar 18 13:58:12 crc kubenswrapper[4921]: I0318 13:58:12.679553 4921 generic.go:334] "Generic (PLEG): container finished" podID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerID="64199adac35b8eb071bb595db418f94dd5a0b077b479e06d2fe73ec5f008935a" exitCode=0 Mar 18 13:58:12 crc kubenswrapper[4921]: I0318 13:58:12.680142 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" event={"ID":"a5542fdb-3025-48eb-940b-0bdcd810301f","Type":"ContainerDied","Data":"64199adac35b8eb071bb595db418f94dd5a0b077b479e06d2fe73ec5f008935a"} Mar 18 13:58:12 crc kubenswrapper[4921]: I0318 13:58:12.939517 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.035100 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-config\") pod \"a5542fdb-3025-48eb-940b-0bdcd810301f\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.035158 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-nb\") pod \"a5542fdb-3025-48eb-940b-0bdcd810301f\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.035237 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-dns-svc\") pod \"a5542fdb-3025-48eb-940b-0bdcd810301f\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.035297 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-openstack-cell1\") pod \"a5542fdb-3025-48eb-940b-0bdcd810301f\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.035341 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-647x2\" (UniqueName: \"kubernetes.io/projected/a5542fdb-3025-48eb-940b-0bdcd810301f-kube-api-access-647x2\") pod \"a5542fdb-3025-48eb-940b-0bdcd810301f\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.035364 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-sb\") pod \"a5542fdb-3025-48eb-940b-0bdcd810301f\" (UID: \"a5542fdb-3025-48eb-940b-0bdcd810301f\") " Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.064916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5542fdb-3025-48eb-940b-0bdcd810301f-kube-api-access-647x2" (OuterVolumeSpecName: "kube-api-access-647x2") pod "a5542fdb-3025-48eb-940b-0bdcd810301f" (UID: "a5542fdb-3025-48eb-940b-0bdcd810301f"). InnerVolumeSpecName "kube-api-access-647x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.138779 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-647x2\" (UniqueName: \"kubernetes.io/projected/a5542fdb-3025-48eb-940b-0bdcd810301f-kube-api-access-647x2\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.200692 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-config" (OuterVolumeSpecName: "config") pod "a5542fdb-3025-48eb-940b-0bdcd810301f" (UID: "a5542fdb-3025-48eb-940b-0bdcd810301f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.241346 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-config\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.261400 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5542fdb-3025-48eb-940b-0bdcd810301f" (UID: "a5542fdb-3025-48eb-940b-0bdcd810301f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.262898 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "a5542fdb-3025-48eb-940b-0bdcd810301f" (UID: "a5542fdb-3025-48eb-940b-0bdcd810301f"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.266834 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a5542fdb-3025-48eb-940b-0bdcd810301f" (UID: "a5542fdb-3025-48eb-940b-0bdcd810301f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.307150 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a5542fdb-3025-48eb-940b-0bdcd810301f" (UID: "a5542fdb-3025-48eb-940b-0bdcd810301f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.344327 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.344373 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.344386 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.344399 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5542fdb-3025-48eb-940b-0bdcd810301f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.692355 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" event={"ID":"a5542fdb-3025-48eb-940b-0bdcd810301f","Type":"ContainerDied","Data":"64ed38410c230d20038ee2aac8b818a1c1e1acd9e01de8e2391eaff680ff64f5"} Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.692428 4921 scope.go:117] "RemoveContainer" containerID="64199adac35b8eb071bb595db418f94dd5a0b077b479e06d2fe73ec5f008935a" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.693602 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65f77b9c99-kt85k" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.695898 4921 generic.go:334] "Generic (PLEG): container finished" podID="db06755a-7e60-4279-bb32-2fb921e21529" containerID="f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc" exitCode=0 Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.695948 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerDied","Data":"f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc"} Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.753759 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-kt85k"] Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.757645 4921 scope.go:117] "RemoveContainer" containerID="c4504c431c451e0501f2c7bd5ab9b61d894c97af5fb35c297d45c6cfd8544b94" Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.764082 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65f77b9c99-kt85k"] Mar 18 13:58:13 crc kubenswrapper[4921]: I0318 13:58:13.966264 4921 scope.go:117] "RemoveContainer" containerID="c1c985eb23db6a055e2d1ea2a93a69fb4ec8d15b260cee77865accb089c9f492" Mar 18 13:58:14 crc kubenswrapper[4921]: I0318 13:58:14.708102 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerStarted","Data":"fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece"} Mar 18 13:58:14 crc kubenswrapper[4921]: I0318 13:58:14.730875 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bm58q" podStartSLOduration=2.153856811 podStartE2EDuration="9.730858464s" podCreationTimestamp="2026-03-18 13:58:05 +0000 UTC" firstStartedPulling="2026-03-18 
13:58:06.579664602 +0000 UTC m=+6506.129585241" lastFinishedPulling="2026-03-18 13:58:14.156666255 +0000 UTC m=+6513.706586894" observedRunningTime="2026-03-18 13:58:14.727184359 +0000 UTC m=+6514.277105018" watchObservedRunningTime="2026-03-18 13:58:14.730858464 +0000 UTC m=+6514.280779103" Mar 18 13:58:15 crc kubenswrapper[4921]: I0318 13:58:15.225519 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" path="/var/lib/kubelet/pods/a5542fdb-3025-48eb-940b-0bdcd810301f/volumes" Mar 18 13:58:15 crc kubenswrapper[4921]: I0318 13:58:15.531316 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:15 crc kubenswrapper[4921]: I0318 13:58:15.531610 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:16 crc kubenswrapper[4921]: I0318 13:58:16.613457 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bm58q" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="registry-server" probeResult="failure" output=< Mar 18 13:58:16 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 13:58:16 crc kubenswrapper[4921]: > Mar 18 13:58:23 crc kubenswrapper[4921]: I0318 13:58:23.841244 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" event={"ID":"717f9e13-67d4-4a9b-a104-b906a75094bf","Type":"ContainerStarted","Data":"8a57dc8808885227cf4f11d6293872d2ad73e99f9644c978bd573e956231de6b"} Mar 18 13:58:23 crc kubenswrapper[4921]: I0318 13:58:23.863642 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" podStartSLOduration=3.172069409 podStartE2EDuration="16.863554051s" podCreationTimestamp="2026-03-18 
13:58:07 +0000 UTC" firstStartedPulling="2026-03-18 13:58:09.196551325 +0000 UTC m=+6508.746471964" lastFinishedPulling="2026-03-18 13:58:22.888035967 +0000 UTC m=+6522.437956606" observedRunningTime="2026-03-18 13:58:23.861231204 +0000 UTC m=+6523.411151863" watchObservedRunningTime="2026-03-18 13:58:23.863554051 +0000 UTC m=+6523.413474690" Mar 18 13:58:24 crc kubenswrapper[4921]: I0318 13:58:24.209927 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:58:24 crc kubenswrapper[4921]: E0318 13:58:24.210689 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:58:25 crc kubenswrapper[4921]: I0318 13:58:25.585618 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:25 crc kubenswrapper[4921]: I0318 13:58:25.642648 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:25 crc kubenswrapper[4921]: I0318 13:58:25.820579 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bm58q"] Mar 18 13:58:26 crc kubenswrapper[4921]: I0318 13:58:26.864235 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bm58q" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="registry-server" containerID="cri-o://fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece" gracePeriod=2 Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.347870 4921 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.469295 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-catalog-content\") pod \"db06755a-7e60-4279-bb32-2fb921e21529\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.469715 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-utilities\") pod \"db06755a-7e60-4279-bb32-2fb921e21529\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.469897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rj89\" (UniqueName: \"kubernetes.io/projected/db06755a-7e60-4279-bb32-2fb921e21529-kube-api-access-4rj89\") pod \"db06755a-7e60-4279-bb32-2fb921e21529\" (UID: \"db06755a-7e60-4279-bb32-2fb921e21529\") " Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.470801 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-utilities" (OuterVolumeSpecName: "utilities") pod "db06755a-7e60-4279-bb32-2fb921e21529" (UID: "db06755a-7e60-4279-bb32-2fb921e21529"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.475760 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db06755a-7e60-4279-bb32-2fb921e21529-kube-api-access-4rj89" (OuterVolumeSpecName: "kube-api-access-4rj89") pod "db06755a-7e60-4279-bb32-2fb921e21529" (UID: "db06755a-7e60-4279-bb32-2fb921e21529"). 
InnerVolumeSpecName "kube-api-access-4rj89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.573129 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.573169 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rj89\" (UniqueName: \"kubernetes.io/projected/db06755a-7e60-4279-bb32-2fb921e21529-kube-api-access-4rj89\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.599311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db06755a-7e60-4279-bb32-2fb921e21529" (UID: "db06755a-7e60-4279-bb32-2fb921e21529"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.675423 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06755a-7e60-4279-bb32-2fb921e21529-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.901613 4921 generic.go:334] "Generic (PLEG): container finished" podID="db06755a-7e60-4279-bb32-2fb921e21529" containerID="fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece" exitCode=0 Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.901700 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bm58q" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.901695 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerDied","Data":"fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece"} Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.905221 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm58q" event={"ID":"db06755a-7e60-4279-bb32-2fb921e21529","Type":"ContainerDied","Data":"617e907f4f323cbd9b15cdcce419bbba4205b8858973a318949f9d671f084399"} Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.905266 4921 scope.go:117] "RemoveContainer" containerID="fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.934780 4921 scope.go:117] "RemoveContainer" containerID="f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc" Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.955341 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bm58q"] Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.955963 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bm58q"] Mar 18 13:58:27 crc kubenswrapper[4921]: I0318 13:58:27.962626 4921 scope.go:117] "RemoveContainer" containerID="756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2" Mar 18 13:58:28 crc kubenswrapper[4921]: I0318 13:58:28.018953 4921 scope.go:117] "RemoveContainer" containerID="fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece" Mar 18 13:58:28 crc kubenswrapper[4921]: E0318 13:58:28.019570 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece\": container with ID starting with fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece not found: ID does not exist" containerID="fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece" Mar 18 13:58:28 crc kubenswrapper[4921]: I0318 13:58:28.019609 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece"} err="failed to get container status \"fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece\": rpc error: code = NotFound desc = could not find container \"fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece\": container with ID starting with fb0816436f1b31a29718482914548cf3204e92524a5876ad06438ed2fdbafece not found: ID does not exist" Mar 18 13:58:28 crc kubenswrapper[4921]: I0318 13:58:28.019654 4921 scope.go:117] "RemoveContainer" containerID="f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc" Mar 18 13:58:28 crc kubenswrapper[4921]: E0318 13:58:28.020221 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc\": container with ID starting with f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc not found: ID does not exist" containerID="f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc" Mar 18 13:58:28 crc kubenswrapper[4921]: I0318 13:58:28.020267 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc"} err="failed to get container status \"f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc\": rpc error: code = NotFound desc = could not find container \"f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc\": container with ID 
starting with f0e737acce7516f85715448bf9bdaef0e0b1bd570e3284c93d5074958f6c41cc not found: ID does not exist" Mar 18 13:58:28 crc kubenswrapper[4921]: I0318 13:58:28.020285 4921 scope.go:117] "RemoveContainer" containerID="756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2" Mar 18 13:58:28 crc kubenswrapper[4921]: E0318 13:58:28.022289 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2\": container with ID starting with 756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2 not found: ID does not exist" containerID="756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2" Mar 18 13:58:28 crc kubenswrapper[4921]: I0318 13:58:28.022328 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2"} err="failed to get container status \"756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2\": rpc error: code = NotFound desc = could not find container \"756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2\": container with ID starting with 756616febc147b8bd65c44a0c8ca916851e3a71325b0c350c89cd6a65c8f11f2 not found: ID does not exist" Mar 18 13:58:29 crc kubenswrapper[4921]: I0318 13:58:29.221304 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db06755a-7e60-4279-bb32-2fb921e21529" path="/var/lib/kubelet/pods/db06755a-7e60-4279-bb32-2fb921e21529/volumes" Mar 18 13:58:37 crc kubenswrapper[4921]: I0318 13:58:37.988496 4921 generic.go:334] "Generic (PLEG): container finished" podID="717f9e13-67d4-4a9b-a104-b906a75094bf" containerID="8a57dc8808885227cf4f11d6293872d2ad73e99f9644c978bd573e956231de6b" exitCode=0 Mar 18 13:58:37 crc kubenswrapper[4921]: I0318 13:58:37.988597 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" event={"ID":"717f9e13-67d4-4a9b-a104-b906a75094bf","Type":"ContainerDied","Data":"8a57dc8808885227cf4f11d6293872d2ad73e99f9644c978bd573e956231de6b"} Mar 18 13:58:38 crc kubenswrapper[4921]: I0318 13:58:38.209570 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:58:38 crc kubenswrapper[4921]: E0318 13:58:38.210040 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.443055 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.532401 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-inventory\") pod \"717f9e13-67d4-4a9b-a104-b906a75094bf\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.532952 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-pre-adoption-validation-combined-ca-bundle\") pod \"717f9e13-67d4-4a9b-a104-b906a75094bf\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.533207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ssh-key-openstack-cell1\") pod \"717f9e13-67d4-4a9b-a104-b906a75094bf\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.533374 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt8gd\" (UniqueName: \"kubernetes.io/projected/717f9e13-67d4-4a9b-a104-b906a75094bf-kube-api-access-qt8gd\") pod \"717f9e13-67d4-4a9b-a104-b906a75094bf\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.534267 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ceph\") pod \"717f9e13-67d4-4a9b-a104-b906a75094bf\" (UID: \"717f9e13-67d4-4a9b-a104-b906a75094bf\") " Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.540241 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ceph" (OuterVolumeSpecName: "ceph") pod "717f9e13-67d4-4a9b-a104-b906a75094bf" (UID: "717f9e13-67d4-4a9b-a104-b906a75094bf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.540445 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "717f9e13-67d4-4a9b-a104-b906a75094bf" (UID: "717f9e13-67d4-4a9b-a104-b906a75094bf"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.541028 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717f9e13-67d4-4a9b-a104-b906a75094bf-kube-api-access-qt8gd" (OuterVolumeSpecName: "kube-api-access-qt8gd") pod "717f9e13-67d4-4a9b-a104-b906a75094bf" (UID: "717f9e13-67d4-4a9b-a104-b906a75094bf"). InnerVolumeSpecName "kube-api-access-qt8gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.566970 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "717f9e13-67d4-4a9b-a104-b906a75094bf" (UID: "717f9e13-67d4-4a9b-a104-b906a75094bf"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.572166 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-inventory" (OuterVolumeSpecName: "inventory") pod "717f9e13-67d4-4a9b-a104-b906a75094bf" (UID: "717f9e13-67d4-4a9b-a104-b906a75094bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.638087 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.638384 4921 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.638480 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.638565 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt8gd\" (UniqueName: \"kubernetes.io/projected/717f9e13-67d4-4a9b-a104-b906a75094bf-kube-api-access-qt8gd\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:39 crc kubenswrapper[4921]: I0318 13:58:39.638639 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/717f9e13-67d4-4a9b-a104-b906a75094bf-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 13:58:40 crc kubenswrapper[4921]: I0318 13:58:40.007249 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" event={"ID":"717f9e13-67d4-4a9b-a104-b906a75094bf","Type":"ContainerDied","Data":"151949f73f9df9976f97d2d61f5d76f40b79be51651b64d65f8a157532a2cac8"} Mar 18 13:58:40 crc kubenswrapper[4921]: I0318 13:58:40.007288 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="151949f73f9df9976f97d2d61f5d76f40b79be51651b64d65f8a157532a2cac8" Mar 18 13:58:40 crc kubenswrapper[4921]: I0318 13:58:40.007334 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cjt672" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.585377 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq"] Mar 18 13:58:45 crc kubenswrapper[4921]: E0318 13:58:45.586702 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="registry-server" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.586720 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="registry-server" Mar 18 13:58:45 crc kubenswrapper[4921]: E0318 13:58:45.586736 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="extract-utilities" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.586744 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="extract-utilities" Mar 18 13:58:45 crc kubenswrapper[4921]: E0318 13:58:45.586768 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerName="init" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.586776 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerName="init" Mar 18 13:58:45 crc kubenswrapper[4921]: E0318 13:58:45.586794 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerName="dnsmasq-dns" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.586801 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerName="dnsmasq-dns" Mar 18 13:58:45 crc kubenswrapper[4921]: E0318 13:58:45.586825 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="extract-content" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.586836 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="extract-content" Mar 18 13:58:45 crc kubenswrapper[4921]: E0318 13:58:45.586849 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717f9e13-67d4-4a9b-a104-b906a75094bf" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.586856 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="717f9e13-67d4-4a9b-a104-b906a75094bf" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.587037 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="db06755a-7e60-4279-bb32-2fb921e21529" containerName="registry-server" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.587052 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5542fdb-3025-48eb-940b-0bdcd810301f" containerName="dnsmasq-dns" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.587070 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="717f9e13-67d4-4a9b-a104-b906a75094bf" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 
18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.588380 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.589996 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.590257 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.592318 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.599519 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.615701 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq"] Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.659052 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.659242 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmkmb\" (UniqueName: \"kubernetes.io/projected/68781f83-55b8-448f-83e1-1981ded6fdd9-kube-api-access-bmkmb\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" 
Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.659295 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.659322 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.659360 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.762169 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmkmb\" (UniqueName: \"kubernetes.io/projected/68781f83-55b8-448f-83e1-1981ded6fdd9-kube-api-access-bmkmb\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.762270 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.762310 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.762351 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.762411 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.768190 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: 
\"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.768213 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.768192 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.772714 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.791050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmkmb\" (UniqueName: \"kubernetes.io/projected/68781f83-55b8-448f-83e1-1981ded6fdd9-kube-api-access-bmkmb\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:45 crc kubenswrapper[4921]: I0318 13:58:45.920605 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 13:58:46 crc kubenswrapper[4921]: I0318 13:58:46.538904 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq"] Mar 18 13:58:47 crc kubenswrapper[4921]: I0318 13:58:47.070934 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" event={"ID":"68781f83-55b8-448f-83e1-1981ded6fdd9","Type":"ContainerStarted","Data":"2025e755077e42dc8fbbea1e657c64ca6083d8847db17ef474b7bc105c86f5dd"} Mar 18 13:58:47 crc kubenswrapper[4921]: I0318 13:58:47.071812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" event={"ID":"68781f83-55b8-448f-83e1-1981ded6fdd9","Type":"ContainerStarted","Data":"94bc2d3a7d02b346fd0617bfdbd8035edb8444402244b559123771b9fe743b58"} Mar 18 13:58:47 crc kubenswrapper[4921]: I0318 13:58:47.089205 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" podStartSLOduration=1.891370325 podStartE2EDuration="2.089185746s" podCreationTimestamp="2026-03-18 13:58:45 +0000 UTC" firstStartedPulling="2026-03-18 13:58:46.564476865 +0000 UTC m=+6546.114397504" lastFinishedPulling="2026-03-18 13:58:46.762292286 +0000 UTC m=+6546.312212925" observedRunningTime="2026-03-18 13:58:47.084464381 +0000 UTC m=+6546.634385030" watchObservedRunningTime="2026-03-18 13:58:47.089185746 +0000 UTC m=+6546.639106415" Mar 18 13:58:50 crc kubenswrapper[4921]: I0318 13:58:50.209638 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:58:50 crc kubenswrapper[4921]: E0318 13:58:50.210507 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:59:01 crc kubenswrapper[4921]: I0318 13:59:01.217157 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:59:01 crc kubenswrapper[4921]: E0318 13:59:01.218070 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:59:13 crc kubenswrapper[4921]: I0318 13:59:13.209609 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:59:13 crc kubenswrapper[4921]: E0318 13:59:13.210746 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:59:24 crc kubenswrapper[4921]: I0318 13:59:24.043129 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-ct24h"] Mar 18 13:59:24 crc kubenswrapper[4921]: I0318 13:59:24.054698 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-ct24h"] Mar 18 13:59:25 crc kubenswrapper[4921]: I0318 
13:59:25.029163 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-2dd7-account-create-update-wmtpf"] Mar 18 13:59:25 crc kubenswrapper[4921]: I0318 13:59:25.042839 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-2dd7-account-create-update-wmtpf"] Mar 18 13:59:25 crc kubenswrapper[4921]: I0318 13:59:25.226075 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361d2424-db33-40cc-bc8e-0689aed6db2e" path="/var/lib/kubelet/pods/361d2424-db33-40cc-bc8e-0689aed6db2e/volumes" Mar 18 13:59:25 crc kubenswrapper[4921]: I0318 13:59:25.228910 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ea264c-1a44-4eb8-b2f5-62542c7865e0" path="/var/lib/kubelet/pods/d0ea264c-1a44-4eb8-b2f5-62542c7865e0/volumes" Mar 18 13:59:26 crc kubenswrapper[4921]: I0318 13:59:26.209537 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:59:26 crc kubenswrapper[4921]: E0318 13:59:26.210065 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:59:31 crc kubenswrapper[4921]: I0318 13:59:31.040313 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-6gcjk"] Mar 18 13:59:31 crc kubenswrapper[4921]: I0318 13:59:31.055855 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-a510-account-create-update-h8fgv"] Mar 18 13:59:31 crc kubenswrapper[4921]: I0318 13:59:31.065898 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-6gcjk"] Mar 
18 13:59:31 crc kubenswrapper[4921]: I0318 13:59:31.075511 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-a510-account-create-update-h8fgv"] Mar 18 13:59:31 crc kubenswrapper[4921]: I0318 13:59:31.222417 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b12096d-0e60-4524-bfa7-34a2512b7292" path="/var/lib/kubelet/pods/5b12096d-0e60-4524-bfa7-34a2512b7292/volumes" Mar 18 13:59:31 crc kubenswrapper[4921]: I0318 13:59:31.224372 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa945ee-8749-453d-8d15-fc2f94c7877f" path="/var/lib/kubelet/pods/7aa945ee-8749-453d-8d15-fc2f94c7877f/volumes" Mar 18 13:59:39 crc kubenswrapper[4921]: I0318 13:59:39.210614 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:59:39 crc kubenswrapper[4921]: E0318 13:59:39.212944 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 13:59:50 crc kubenswrapper[4921]: I0318 13:59:50.209856 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 13:59:50 crc kubenswrapper[4921]: E0318 13:59:50.210641 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.160968 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6"] Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.163479 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.167727 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.168316 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.175248 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564040-j2hp4"] Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.181093 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.185348 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.185516 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.185954 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.187665 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-j2hp4"] Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.200254 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6"] Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.215094 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f65c2b9-1d0d-4f73-a228-4585b515c979-secret-volume\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.215215 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x52h\" (UniqueName: \"kubernetes.io/projected/0f65c2b9-1d0d-4f73-a228-4585b515c979-kube-api-access-6x52h\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.215399 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f65c2b9-1d0d-4f73-a228-4585b515c979-config-volume\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.317695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f65c2b9-1d0d-4f73-a228-4585b515c979-config-volume\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.318097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskqr\" (UniqueName: \"kubernetes.io/projected/1974abf9-efc3-4a7a-aa1c-6e604692767f-kube-api-access-xskqr\") pod \"auto-csr-approver-29564040-j2hp4\" (UID: \"1974abf9-efc3-4a7a-aa1c-6e604692767f\") " pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.318152 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f65c2b9-1d0d-4f73-a228-4585b515c979-secret-volume\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.318204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x52h\" (UniqueName: \"kubernetes.io/projected/0f65c2b9-1d0d-4f73-a228-4585b515c979-kube-api-access-6x52h\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.319952 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f65c2b9-1d0d-4f73-a228-4585b515c979-config-volume\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.331096 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f65c2b9-1d0d-4f73-a228-4585b515c979-secret-volume\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.348563 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x52h\" (UniqueName: \"kubernetes.io/projected/0f65c2b9-1d0d-4f73-a228-4585b515c979-kube-api-access-6x52h\") pod \"collect-profiles-29564040-lmrk6\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.420345 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xskqr\" (UniqueName: \"kubernetes.io/projected/1974abf9-efc3-4a7a-aa1c-6e604692767f-kube-api-access-xskqr\") pod \"auto-csr-approver-29564040-j2hp4\" (UID: \"1974abf9-efc3-4a7a-aa1c-6e604692767f\") " pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.437233 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xskqr\" (UniqueName: 
\"kubernetes.io/projected/1974abf9-efc3-4a7a-aa1c-6e604692767f-kube-api-access-xskqr\") pod \"auto-csr-approver-29564040-j2hp4\" (UID: \"1974abf9-efc3-4a7a-aa1c-6e604692767f\") " pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.496045 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:00 crc kubenswrapper[4921]: I0318 14:00:00.507977 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:01 crc kubenswrapper[4921]: I0318 14:00:01.009912 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6"] Mar 18 14:00:01 crc kubenswrapper[4921]: W0318 14:00:01.015896 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f65c2b9_1d0d_4f73_a228_4585b515c979.slice/crio-f445771bb38edcc6bdae83d0ab577af083a83c5d56f64bec7cbc449877cec7b1 WatchSource:0}: Error finding container f445771bb38edcc6bdae83d0ab577af083a83c5d56f64bec7cbc449877cec7b1: Status 404 returned error can't find the container with id f445771bb38edcc6bdae83d0ab577af083a83c5d56f64bec7cbc449877cec7b1 Mar 18 14:00:01 crc kubenswrapper[4921]: I0318 14:00:01.123188 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-j2hp4"] Mar 18 14:00:01 crc kubenswrapper[4921]: W0318 14:00:01.146821 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1974abf9_efc3_4a7a_aa1c_6e604692767f.slice/crio-e200acfd7fc0ba44843fa1834dcd0730776e4501971e8627204547e03435caa3 WatchSource:0}: Error finding container e200acfd7fc0ba44843fa1834dcd0730776e4501971e8627204547e03435caa3: Status 404 returned error can't 
find the container with id e200acfd7fc0ba44843fa1834dcd0730776e4501971e8627204547e03435caa3 Mar 18 14:00:01 crc kubenswrapper[4921]: I0318 14:00:01.769089 4921 generic.go:334] "Generic (PLEG): container finished" podID="0f65c2b9-1d0d-4f73-a228-4585b515c979" containerID="2192aca7b19fda3fe1e04781bfee868a9befee5b4b7b1d20c5b4c37090d5874e" exitCode=0 Mar 18 14:00:01 crc kubenswrapper[4921]: I0318 14:00:01.769378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" event={"ID":"0f65c2b9-1d0d-4f73-a228-4585b515c979","Type":"ContainerDied","Data":"2192aca7b19fda3fe1e04781bfee868a9befee5b4b7b1d20c5b4c37090d5874e"} Mar 18 14:00:01 crc kubenswrapper[4921]: I0318 14:00:01.769488 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" event={"ID":"0f65c2b9-1d0d-4f73-a228-4585b515c979","Type":"ContainerStarted","Data":"f445771bb38edcc6bdae83d0ab577af083a83c5d56f64bec7cbc449877cec7b1"} Mar 18 14:00:01 crc kubenswrapper[4921]: I0318 14:00:01.770840 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" event={"ID":"1974abf9-efc3-4a7a-aa1c-6e604692767f","Type":"ContainerStarted","Data":"e200acfd7fc0ba44843fa1834dcd0730776e4501971e8627204547e03435caa3"} Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.134927 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.176214 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f65c2b9-1d0d-4f73-a228-4585b515c979-secret-volume\") pod \"0f65c2b9-1d0d-4f73-a228-4585b515c979\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.176265 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f65c2b9-1d0d-4f73-a228-4585b515c979-config-volume\") pod \"0f65c2b9-1d0d-4f73-a228-4585b515c979\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.176495 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x52h\" (UniqueName: \"kubernetes.io/projected/0f65c2b9-1d0d-4f73-a228-4585b515c979-kube-api-access-6x52h\") pod \"0f65c2b9-1d0d-4f73-a228-4585b515c979\" (UID: \"0f65c2b9-1d0d-4f73-a228-4585b515c979\") " Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.178527 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f65c2b9-1d0d-4f73-a228-4585b515c979-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f65c2b9-1d0d-4f73-a228-4585b515c979" (UID: "0f65c2b9-1d0d-4f73-a228-4585b515c979"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.184264 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f65c2b9-1d0d-4f73-a228-4585b515c979-kube-api-access-6x52h" (OuterVolumeSpecName: "kube-api-access-6x52h") pod "0f65c2b9-1d0d-4f73-a228-4585b515c979" (UID: "0f65c2b9-1d0d-4f73-a228-4585b515c979"). 
InnerVolumeSpecName "kube-api-access-6x52h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.184496 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f65c2b9-1d0d-4f73-a228-4585b515c979-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f65c2b9-1d0d-4f73-a228-4585b515c979" (UID: "0f65c2b9-1d0d-4f73-a228-4585b515c979"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.210170 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 14:00:03 crc kubenswrapper[4921]: E0318 14:00:03.210486 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.279749 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f65c2b9-1d0d-4f73-a228-4585b515c979-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.279781 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f65c2b9-1d0d-4f73-a228-4585b515c979-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.279791 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x52h\" (UniqueName: \"kubernetes.io/projected/0f65c2b9-1d0d-4f73-a228-4585b515c979-kube-api-access-6x52h\") on node \"crc\" 
DevicePath \"\"" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.791360 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" event={"ID":"0f65c2b9-1d0d-4f73-a228-4585b515c979","Type":"ContainerDied","Data":"f445771bb38edcc6bdae83d0ab577af083a83c5d56f64bec7cbc449877cec7b1"} Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.791406 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f445771bb38edcc6bdae83d0ab577af083a83c5d56f64bec7cbc449877cec7b1" Mar 18 14:00:03 crc kubenswrapper[4921]: I0318 14:00:03.791448 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6" Mar 18 14:00:04 crc kubenswrapper[4921]: I0318 14:00:04.429212 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"] Mar 18 14:00:04 crc kubenswrapper[4921]: I0318 14:00:04.438792 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-x4whr"] Mar 18 14:00:05 crc kubenswrapper[4921]: I0318 14:00:05.227729 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1186373-3831-4f51-b591-a01e74e1f199" path="/var/lib/kubelet/pods/c1186373-3831-4f51-b591-a01e74e1f199/volumes" Mar 18 14:00:05 crc kubenswrapper[4921]: I0318 14:00:05.811772 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" event={"ID":"1974abf9-efc3-4a7a-aa1c-6e604692767f","Type":"ContainerStarted","Data":"d124030762d18692cc74109f23efc76d48d60419a6c2e9bd6a293510f755b775"} Mar 18 14:00:05 crc kubenswrapper[4921]: I0318 14:00:05.829798 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" podStartSLOduration=1.949456037 
podStartE2EDuration="5.829776025s" podCreationTimestamp="2026-03-18 14:00:00 +0000 UTC" firstStartedPulling="2026-03-18 14:00:01.149351391 +0000 UTC m=+6620.699272030" lastFinishedPulling="2026-03-18 14:00:05.029671379 +0000 UTC m=+6624.579592018" observedRunningTime="2026-03-18 14:00:05.824875064 +0000 UTC m=+6625.374795713" watchObservedRunningTime="2026-03-18 14:00:05.829776025 +0000 UTC m=+6625.379696664" Mar 18 14:00:06 crc kubenswrapper[4921]: I0318 14:00:06.821570 4921 generic.go:334] "Generic (PLEG): container finished" podID="1974abf9-efc3-4a7a-aa1c-6e604692767f" containerID="d124030762d18692cc74109f23efc76d48d60419a6c2e9bd6a293510f755b775" exitCode=0 Mar 18 14:00:06 crc kubenswrapper[4921]: I0318 14:00:06.821870 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" event={"ID":"1974abf9-efc3-4a7a-aa1c-6e604692767f","Type":"ContainerDied","Data":"d124030762d18692cc74109f23efc76d48d60419a6c2e9bd6a293510f755b775"} Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.251128 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.479581 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xskqr\" (UniqueName: \"kubernetes.io/projected/1974abf9-efc3-4a7a-aa1c-6e604692767f-kube-api-access-xskqr\") pod \"1974abf9-efc3-4a7a-aa1c-6e604692767f\" (UID: \"1974abf9-efc3-4a7a-aa1c-6e604692767f\") " Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.486532 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1974abf9-efc3-4a7a-aa1c-6e604692767f-kube-api-access-xskqr" (OuterVolumeSpecName: "kube-api-access-xskqr") pod "1974abf9-efc3-4a7a-aa1c-6e604692767f" (UID: "1974abf9-efc3-4a7a-aa1c-6e604692767f"). InnerVolumeSpecName "kube-api-access-xskqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.583046 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xskqr\" (UniqueName: \"kubernetes.io/projected/1974abf9-efc3-4a7a-aa1c-6e604692767f-kube-api-access-xskqr\") on node \"crc\" DevicePath \"\"" Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.848798 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" event={"ID":"1974abf9-efc3-4a7a-aa1c-6e604692767f","Type":"ContainerDied","Data":"e200acfd7fc0ba44843fa1834dcd0730776e4501971e8627204547e03435caa3"} Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.849130 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e200acfd7fc0ba44843fa1834dcd0730776e4501971e8627204547e03435caa3" Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.848863 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564040-j2hp4" Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.891367 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-lbk74"] Mar 18 14:00:08 crc kubenswrapper[4921]: I0318 14:00:08.900210 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564034-lbk74"] Mar 18 14:00:09 crc kubenswrapper[4921]: I0318 14:00:09.222733 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb493be-0607-4152-ac1d-08fc9ff76476" path="/var/lib/kubelet/pods/fdb493be-0607-4152-ac1d-08fc9ff76476/volumes" Mar 18 14:00:14 crc kubenswrapper[4921]: I0318 14:00:14.146960 4921 scope.go:117] "RemoveContainer" containerID="fc5d89e6e4e55ee3c69178f99812539c7d8b1efa31c39027bfbb24cc78746eb4" Mar 18 14:00:14 crc kubenswrapper[4921]: I0318 14:00:14.217987 4921 scope.go:117] "RemoveContainer" 
containerID="93c3bdf7f29dc0f99cc6d903b03be0f521e595f4eb839ed21c91416a85ca6c06" Mar 18 14:00:14 crc kubenswrapper[4921]: I0318 14:00:14.258797 4921 scope.go:117] "RemoveContainer" containerID="5b5517d29514e37f774ecafd1a20150e21555e8f525b2df22cebcabb3ee45be5" Mar 18 14:00:14 crc kubenswrapper[4921]: I0318 14:00:14.312456 4921 scope.go:117] "RemoveContainer" containerID="69bf16f18bc38d9fb59da4269c36da94fb68071a9ff700ccb99360fabc907c6b" Mar 18 14:00:14 crc kubenswrapper[4921]: I0318 14:00:14.361169 4921 scope.go:117] "RemoveContainer" containerID="bdec4afe195b74f79c6942082d4eb4b734b92793c6803d8ebce791ac52bfb0c7" Mar 18 14:00:14 crc kubenswrapper[4921]: I0318 14:00:14.468959 4921 scope.go:117] "RemoveContainer" containerID="95d5594c7ab4a13802663d99a28123761e057657485e4147ad54b4ebe91a0b6d" Mar 18 14:00:16 crc kubenswrapper[4921]: I0318 14:00:16.209373 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 14:00:16 crc kubenswrapper[4921]: E0318 14:00:16.210588 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:00:20 crc kubenswrapper[4921]: I0318 14:00:20.041310 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-g72k9"] Mar 18 14:00:20 crc kubenswrapper[4921]: I0318 14:00:20.051340 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-g72k9"] Mar 18 14:00:21 crc kubenswrapper[4921]: I0318 14:00:21.224254 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31fa49e-6ecc-4079-9f4b-76b05cba5fea" 
path="/var/lib/kubelet/pods/b31fa49e-6ecc-4079-9f4b-76b05cba5fea/volumes" Mar 18 14:00:28 crc kubenswrapper[4921]: I0318 14:00:28.210687 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 14:00:28 crc kubenswrapper[4921]: E0318 14:00:28.211442 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:00:39 crc kubenswrapper[4921]: I0318 14:00:39.208870 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 14:00:39 crc kubenswrapper[4921]: E0318 14:00:39.209683 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:00:50 crc kubenswrapper[4921]: I0318 14:00:50.210546 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b" Mar 18 14:00:51 crc kubenswrapper[4921]: I0318 14:00:51.304070 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"d2cd7ff9067aa80b963b291fa9725c926eb6ca5e129d5aba088028134092770a"} Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.165661 
4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29564041-8wr8v"] Mar 18 14:01:00 crc kubenswrapper[4921]: E0318 14:01:00.166736 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f65c2b9-1d0d-4f73-a228-4585b515c979" containerName="collect-profiles" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.166756 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f65c2b9-1d0d-4f73-a228-4585b515c979" containerName="collect-profiles" Mar 18 14:01:00 crc kubenswrapper[4921]: E0318 14:01:00.166789 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1974abf9-efc3-4a7a-aa1c-6e604692767f" containerName="oc" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.166797 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1974abf9-efc3-4a7a-aa1c-6e604692767f" containerName="oc" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.167064 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1974abf9-efc3-4a7a-aa1c-6e604692767f" containerName="oc" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.167095 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f65c2b9-1d0d-4f73-a228-4585b515c979" containerName="collect-profiles" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.167989 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.177459 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564041-8wr8v"] Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.271029 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-config-data\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.271139 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw297\" (UniqueName: \"kubernetes.io/projected/a078d82d-65b2-4164-8bac-980a6d17780c-kube-api-access-jw297\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.271410 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-fernet-keys\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.271779 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-combined-ca-bundle\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.373701 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-combined-ca-bundle\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.373817 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-config-data\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.373913 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw297\" (UniqueName: \"kubernetes.io/projected/a078d82d-65b2-4164-8bac-980a6d17780c-kube-api-access-jw297\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.374027 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-fernet-keys\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.387013 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-fernet-keys\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.387304 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-config-data\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.393208 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-combined-ca-bundle\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.395876 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw297\" (UniqueName: \"kubernetes.io/projected/a078d82d-65b2-4164-8bac-980a6d17780c-kube-api-access-jw297\") pod \"keystone-cron-29564041-8wr8v\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.486431 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:00 crc kubenswrapper[4921]: I0318 14:01:00.968399 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29564041-8wr8v"] Mar 18 14:01:01 crc kubenswrapper[4921]: I0318 14:01:01.413599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-8wr8v" event={"ID":"a078d82d-65b2-4164-8bac-980a6d17780c","Type":"ContainerStarted","Data":"4d7eda9ffe8f0ebb869a8bdd43b918b4f13b81c406f00fa7529875b08c25d5ae"} Mar 18 14:01:01 crc kubenswrapper[4921]: I0318 14:01:01.413967 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-8wr8v" event={"ID":"a078d82d-65b2-4164-8bac-980a6d17780c","Type":"ContainerStarted","Data":"8d6a8a45a0523bef19eaf18778a9cf1518bf5a99fb0453acef3ebfc413fd0189"} Mar 18 14:01:01 crc kubenswrapper[4921]: I0318 14:01:01.436255 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29564041-8wr8v" podStartSLOduration=1.4362348090000001 podStartE2EDuration="1.436234809s" podCreationTimestamp="2026-03-18 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 14:01:01.434697445 +0000 UTC m=+6680.984618084" watchObservedRunningTime="2026-03-18 14:01:01.436234809 +0000 UTC m=+6680.986155458" Mar 18 14:01:04 crc kubenswrapper[4921]: I0318 14:01:04.439776 4921 generic.go:334] "Generic (PLEG): container finished" podID="a078d82d-65b2-4164-8bac-980a6d17780c" containerID="4d7eda9ffe8f0ebb869a8bdd43b918b4f13b81c406f00fa7529875b08c25d5ae" exitCode=0 Mar 18 14:01:04 crc kubenswrapper[4921]: I0318 14:01:04.439863 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-8wr8v" 
event={"ID":"a078d82d-65b2-4164-8bac-980a6d17780c","Type":"ContainerDied","Data":"4d7eda9ffe8f0ebb869a8bdd43b918b4f13b81c406f00fa7529875b08c25d5ae"} Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.831458 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.885694 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-config-data\") pod \"a078d82d-65b2-4164-8bac-980a6d17780c\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.885842 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-combined-ca-bundle\") pod \"a078d82d-65b2-4164-8bac-980a6d17780c\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.885921 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw297\" (UniqueName: \"kubernetes.io/projected/a078d82d-65b2-4164-8bac-980a6d17780c-kube-api-access-jw297\") pod \"a078d82d-65b2-4164-8bac-980a6d17780c\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.886126 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-fernet-keys\") pod \"a078d82d-65b2-4164-8bac-980a6d17780c\" (UID: \"a078d82d-65b2-4164-8bac-980a6d17780c\") " Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.892939 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a078d82d-65b2-4164-8bac-980a6d17780c-kube-api-access-jw297" 
(OuterVolumeSpecName: "kube-api-access-jw297") pod "a078d82d-65b2-4164-8bac-980a6d17780c" (UID: "a078d82d-65b2-4164-8bac-980a6d17780c"). InnerVolumeSpecName "kube-api-access-jw297". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.893251 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a078d82d-65b2-4164-8bac-980a6d17780c" (UID: "a078d82d-65b2-4164-8bac-980a6d17780c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.919000 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a078d82d-65b2-4164-8bac-980a6d17780c" (UID: "a078d82d-65b2-4164-8bac-980a6d17780c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.978079 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-config-data" (OuterVolumeSpecName: "config-data") pod "a078d82d-65b2-4164-8bac-980a6d17780c" (UID: "a078d82d-65b2-4164-8bac-980a6d17780c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.988190 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.988238 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.988254 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw297\" (UniqueName: \"kubernetes.io/projected/a078d82d-65b2-4164-8bac-980a6d17780c-kube-api-access-jw297\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:05 crc kubenswrapper[4921]: I0318 14:01:05.988267 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a078d82d-65b2-4164-8bac-980a6d17780c-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 14:01:06 crc kubenswrapper[4921]: I0318 14:01:06.457482 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29564041-8wr8v" event={"ID":"a078d82d-65b2-4164-8bac-980a6d17780c","Type":"ContainerDied","Data":"8d6a8a45a0523bef19eaf18778a9cf1518bf5a99fb0453acef3ebfc413fd0189"} Mar 18 14:01:06 crc kubenswrapper[4921]: I0318 14:01:06.457523 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d6a8a45a0523bef19eaf18778a9cf1518bf5a99fb0453acef3ebfc413fd0189" Mar 18 14:01:06 crc kubenswrapper[4921]: I0318 14:01:06.457584 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29564041-8wr8v" Mar 18 14:01:14 crc kubenswrapper[4921]: I0318 14:01:14.678227 4921 scope.go:117] "RemoveContainer" containerID="30f444677839be1824cf2d89174630b2e3e37dde5d69b24fba8e8aa5432a456c" Mar 18 14:01:14 crc kubenswrapper[4921]: I0318 14:01:14.726153 4921 scope.go:117] "RemoveContainer" containerID="e45d64d1c687b56dd7cc7d058090faad50e017927dbc67ab286d6d3bc3d107b9" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.170504 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564042-qgq8b"] Mar 18 14:02:00 crc kubenswrapper[4921]: E0318 14:02:00.171632 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a078d82d-65b2-4164-8bac-980a6d17780c" containerName="keystone-cron" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.171649 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a078d82d-65b2-4164-8bac-980a6d17780c" containerName="keystone-cron" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.171938 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a078d82d-65b2-4164-8bac-980a6d17780c" containerName="keystone-cron" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.173038 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-qgq8b" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.175276 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.175613 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.175900 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.180230 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-qgq8b"] Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.294467 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr62r\" (UniqueName: \"kubernetes.io/projected/04ddbe27-5129-44a0-8c88-443e36afa64d-kube-api-access-gr62r\") pod \"auto-csr-approver-29564042-qgq8b\" (UID: \"04ddbe27-5129-44a0-8c88-443e36afa64d\") " pod="openshift-infra/auto-csr-approver-29564042-qgq8b" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.396525 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr62r\" (UniqueName: \"kubernetes.io/projected/04ddbe27-5129-44a0-8c88-443e36afa64d-kube-api-access-gr62r\") pod \"auto-csr-approver-29564042-qgq8b\" (UID: \"04ddbe27-5129-44a0-8c88-443e36afa64d\") " pod="openshift-infra/auto-csr-approver-29564042-qgq8b" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.414898 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr62r\" (UniqueName: \"kubernetes.io/projected/04ddbe27-5129-44a0-8c88-443e36afa64d-kube-api-access-gr62r\") pod \"auto-csr-approver-29564042-qgq8b\" (UID: \"04ddbe27-5129-44a0-8c88-443e36afa64d\") " 
pod="openshift-infra/auto-csr-approver-29564042-qgq8b" Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.491961 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-qgq8b" Mar 18 14:02:00 crc kubenswrapper[4921]: W0318 14:02:00.983008 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ddbe27_5129_44a0_8c88_443e36afa64d.slice/crio-0e08d3c32e7dd556eee7f3d62f34bcda4709a6b8b416dda1d820d1c0fecc0212 WatchSource:0}: Error finding container 0e08d3c32e7dd556eee7f3d62f34bcda4709a6b8b416dda1d820d1c0fecc0212: Status 404 returned error can't find the container with id 0e08d3c32e7dd556eee7f3d62f34bcda4709a6b8b416dda1d820d1c0fecc0212 Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.985713 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-qgq8b"] Mar 18 14:02:00 crc kubenswrapper[4921]: I0318 14:02:00.985763 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:02:01 crc kubenswrapper[4921]: I0318 14:02:01.057966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-qgq8b" event={"ID":"04ddbe27-5129-44a0-8c88-443e36afa64d","Type":"ContainerStarted","Data":"0e08d3c32e7dd556eee7f3d62f34bcda4709a6b8b416dda1d820d1c0fecc0212"} Mar 18 14:02:03 crc kubenswrapper[4921]: I0318 14:02:03.082945 4921 generic.go:334] "Generic (PLEG): container finished" podID="04ddbe27-5129-44a0-8c88-443e36afa64d" containerID="1a4f19bcf99af9840423700d66ae4051dfbf12372331cdbd4d711296feaef27d" exitCode=0 Mar 18 14:02:03 crc kubenswrapper[4921]: I0318 14:02:03.083027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-qgq8b" 
event={"ID":"04ddbe27-5129-44a0-8c88-443e36afa64d","Type":"ContainerDied","Data":"1a4f19bcf99af9840423700d66ae4051dfbf12372331cdbd4d711296feaef27d"} Mar 18 14:02:04 crc kubenswrapper[4921]: I0318 14:02:04.480999 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-qgq8b" Mar 18 14:02:04 crc kubenswrapper[4921]: I0318 14:02:04.591502 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr62r\" (UniqueName: \"kubernetes.io/projected/04ddbe27-5129-44a0-8c88-443e36afa64d-kube-api-access-gr62r\") pod \"04ddbe27-5129-44a0-8c88-443e36afa64d\" (UID: \"04ddbe27-5129-44a0-8c88-443e36afa64d\") " Mar 18 14:02:04 crc kubenswrapper[4921]: I0318 14:02:04.600328 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ddbe27-5129-44a0-8c88-443e36afa64d-kube-api-access-gr62r" (OuterVolumeSpecName: "kube-api-access-gr62r") pod "04ddbe27-5129-44a0-8c88-443e36afa64d" (UID: "04ddbe27-5129-44a0-8c88-443e36afa64d"). InnerVolumeSpecName "kube-api-access-gr62r". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:02:04 crc kubenswrapper[4921]: I0318 14:02:04.694233 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr62r\" (UniqueName: \"kubernetes.io/projected/04ddbe27-5129-44a0-8c88-443e36afa64d-kube-api-access-gr62r\") on node \"crc\" DevicePath \"\""
Mar 18 14:02:05 crc kubenswrapper[4921]: I0318 14:02:05.102689 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564042-qgq8b" event={"ID":"04ddbe27-5129-44a0-8c88-443e36afa64d","Type":"ContainerDied","Data":"0e08d3c32e7dd556eee7f3d62f34bcda4709a6b8b416dda1d820d1c0fecc0212"}
Mar 18 14:02:05 crc kubenswrapper[4921]: I0318 14:02:05.102735 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e08d3c32e7dd556eee7f3d62f34bcda4709a6b8b416dda1d820d1c0fecc0212"
Mar 18 14:02:05 crc kubenswrapper[4921]: I0318 14:02:05.102768 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564042-qgq8b"
Mar 18 14:02:05 crc kubenswrapper[4921]: I0318 14:02:05.611232 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-wfcm4"]
Mar 18 14:02:05 crc kubenswrapper[4921]: I0318 14:02:05.624456 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564036-wfcm4"]
Mar 18 14:02:07 crc kubenswrapper[4921]: I0318 14:02:07.224351 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18516be-a913-4d85-9d32-265cc891709a" path="/var/lib/kubelet/pods/d18516be-a913-4d85-9d32-265cc891709a/volumes"
Mar 18 14:02:14 crc kubenswrapper[4921]: I0318 14:02:14.806397 4921 scope.go:117] "RemoveContainer" containerID="096664877cf08671ec5e2aadfd8b4fe3c9b8263f3852f0bc32d0e6986e8d750c"
Mar 18 14:03:17 crc kubenswrapper[4921]: I0318 14:03:17.081001 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:03:17 crc kubenswrapper[4921]: I0318 14:03:17.081683 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:03:33 crc kubenswrapper[4921]: I0318 14:03:33.056581 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-p8gcr"]
Mar 18 14:03:33 crc kubenswrapper[4921]: I0318 14:03:33.065248 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-a95a-account-create-update-px4b8"]
Mar 18 14:03:33 crc kubenswrapper[4921]: I0318 14:03:33.076352 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-p8gcr"]
Mar 18 14:03:33 crc kubenswrapper[4921]: I0318 14:03:33.086721 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a95a-account-create-update-px4b8"]
Mar 18 14:03:33 crc kubenswrapper[4921]: I0318 14:03:33.221259 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa" path="/var/lib/kubelet/pods/9443c0cc-9206-4b98-a5ca-d9ec17bdb1fa/volumes"
Mar 18 14:03:33 crc kubenswrapper[4921]: I0318 14:03:33.221866 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5" path="/var/lib/kubelet/pods/def5f6bf-6fb5-4e8e-a9cf-7f138dcc98e5/volumes"
Mar 18 14:03:47 crc kubenswrapper[4921]: I0318 14:03:47.081266 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:03:47 crc kubenswrapper[4921]: I0318 14:03:47.081723 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:03:50 crc kubenswrapper[4921]: I0318 14:03:50.037710 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-qfr9v"]
Mar 18 14:03:50 crc kubenswrapper[4921]: I0318 14:03:50.051225 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-qfr9v"]
Mar 18 14:03:51 crc kubenswrapper[4921]: I0318 14:03:51.223255 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49dbf0c7-4c4a-484b-ab76-fb64ea938c0f" path="/var/lib/kubelet/pods/49dbf0c7-4c4a-484b-ab76-fb64ea938c0f/volumes"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.160607 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564044-mmg5m"]
Mar 18 14:04:00 crc kubenswrapper[4921]: E0318 14:04:00.161697 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ddbe27-5129-44a0-8c88-443e36afa64d" containerName="oc"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.161710 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ddbe27-5129-44a0-8c88-443e36afa64d" containerName="oc"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.161892 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ddbe27-5129-44a0-8c88-443e36afa64d" containerName="oc"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.162593 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.165312 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.165485 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.165846 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.178871 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-mmg5m"]
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.218548 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dk8n\" (UniqueName: \"kubernetes.io/projected/e017f54c-f94e-4793-9818-3c6feb26da84-kube-api-access-2dk8n\") pod \"auto-csr-approver-29564044-mmg5m\" (UID: \"e017f54c-f94e-4793-9818-3c6feb26da84\") " pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.321068 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dk8n\" (UniqueName: \"kubernetes.io/projected/e017f54c-f94e-4793-9818-3c6feb26da84-kube-api-access-2dk8n\") pod \"auto-csr-approver-29564044-mmg5m\" (UID: \"e017f54c-f94e-4793-9818-3c6feb26da84\") " pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.340653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dk8n\" (UniqueName: \"kubernetes.io/projected/e017f54c-f94e-4793-9818-3c6feb26da84-kube-api-access-2dk8n\") pod \"auto-csr-approver-29564044-mmg5m\" (UID: \"e017f54c-f94e-4793-9818-3c6feb26da84\") " pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.491544 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:00 crc kubenswrapper[4921]: I0318 14:04:00.983270 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-mmg5m"]
Mar 18 14:04:01 crc kubenswrapper[4921]: I0318 14:04:01.255576 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-mmg5m" event={"ID":"e017f54c-f94e-4793-9818-3c6feb26da84","Type":"ContainerStarted","Data":"c138d7921548f6db25ac87c6cf79b55481216fcfbc19be4ab3b20e9dcaf9e0b3"}
Mar 18 14:04:03 crc kubenswrapper[4921]: I0318 14:04:03.273445 4921 generic.go:334] "Generic (PLEG): container finished" podID="e017f54c-f94e-4793-9818-3c6feb26da84" containerID="3623a18845652aa7b71d7b591409f22320cd8f9585d8229f532162bc3420d77b" exitCode=0
Mar 18 14:04:03 crc kubenswrapper[4921]: I0318 14:04:03.273502 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-mmg5m" event={"ID":"e017f54c-f94e-4793-9818-3c6feb26da84","Type":"ContainerDied","Data":"3623a18845652aa7b71d7b591409f22320cd8f9585d8229f532162bc3420d77b"}
Mar 18 14:04:04 crc kubenswrapper[4921]: I0318 14:04:04.706213 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:04 crc kubenswrapper[4921]: I0318 14:04:04.844333 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dk8n\" (UniqueName: \"kubernetes.io/projected/e017f54c-f94e-4793-9818-3c6feb26da84-kube-api-access-2dk8n\") pod \"e017f54c-f94e-4793-9818-3c6feb26da84\" (UID: \"e017f54c-f94e-4793-9818-3c6feb26da84\") "
Mar 18 14:04:04 crc kubenswrapper[4921]: I0318 14:04:04.849490 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e017f54c-f94e-4793-9818-3c6feb26da84-kube-api-access-2dk8n" (OuterVolumeSpecName: "kube-api-access-2dk8n") pod "e017f54c-f94e-4793-9818-3c6feb26da84" (UID: "e017f54c-f94e-4793-9818-3c6feb26da84"). InnerVolumeSpecName "kube-api-access-2dk8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:04:04 crc kubenswrapper[4921]: I0318 14:04:04.946946 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dk8n\" (UniqueName: \"kubernetes.io/projected/e017f54c-f94e-4793-9818-3c6feb26da84-kube-api-access-2dk8n\") on node \"crc\" DevicePath \"\""
Mar 18 14:04:05 crc kubenswrapper[4921]: I0318 14:04:05.292384 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564044-mmg5m" event={"ID":"e017f54c-f94e-4793-9818-3c6feb26da84","Type":"ContainerDied","Data":"c138d7921548f6db25ac87c6cf79b55481216fcfbc19be4ab3b20e9dcaf9e0b3"}
Mar 18 14:04:05 crc kubenswrapper[4921]: I0318 14:04:05.292774 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c138d7921548f6db25ac87c6cf79b55481216fcfbc19be4ab3b20e9dcaf9e0b3"
Mar 18 14:04:05 crc kubenswrapper[4921]: I0318 14:04:05.292470 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564044-mmg5m"
Mar 18 14:04:05 crc kubenswrapper[4921]: I0318 14:04:05.796919 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-xzctk"]
Mar 18 14:04:05 crc kubenswrapper[4921]: I0318 14:04:05.807704 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564038-xzctk"]
Mar 18 14:04:07 crc kubenswrapper[4921]: I0318 14:04:07.223884 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46ce871-cef4-40f1-bccb-5beb9b745e00" path="/var/lib/kubelet/pods/e46ce871-cef4-40f1-bccb-5beb9b745e00/volumes"
Mar 18 14:04:14 crc kubenswrapper[4921]: I0318 14:04:14.954988 4921 scope.go:117] "RemoveContainer" containerID="59762b8a74b0604252b8c7c467b07edb6a5c799c5c10edc6f32fbea4c944fa70"
Mar 18 14:04:14 crc kubenswrapper[4921]: I0318 14:04:14.984835 4921 scope.go:117] "RemoveContainer" containerID="177a4eeeb90b5611d6ad43d0683aabcdba4e5b7c4fc91edc4b0c40a1ddf69884"
Mar 18 14:04:15 crc kubenswrapper[4921]: I0318 14:04:15.048948 4921 scope.go:117] "RemoveContainer" containerID="acb77d81e41818c608172be5035eb3ec65c4978a866e71da8f463ce5c7e5af17"
Mar 18 14:04:15 crc kubenswrapper[4921]: I0318 14:04:15.093006 4921 scope.go:117] "RemoveContainer" containerID="6cb3f8aecc8e73cb106559ae90d47b6841266defba6d3964b6533e1d64deccde"
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.081765 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.082161 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.082205 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7"
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.083149 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2cd7ff9067aa80b963b291fa9725c926eb6ca5e129d5aba088028134092770a"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.083241 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://d2cd7ff9067aa80b963b291fa9725c926eb6ca5e129d5aba088028134092770a" gracePeriod=600
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.399149 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="d2cd7ff9067aa80b963b291fa9725c926eb6ca5e129d5aba088028134092770a" exitCode=0
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.399227 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"d2cd7ff9067aa80b963b291fa9725c926eb6ca5e129d5aba088028134092770a"}
Mar 18 14:04:17 crc kubenswrapper[4921]: I0318 14:04:17.399584 4921 scope.go:117] "RemoveContainer" containerID="ea26e1f7593148d4b5dfffe47b9e06095cd61d03b80854f6a3e6a19682d25d3b"
Mar 18 14:04:18 crc kubenswrapper[4921]: I0318 14:04:18.411326 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354"}
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.044451 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-58d5-account-create-update-4zxmv"]
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.055586 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-kb4pr"]
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.073955 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-kb4pr"]
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.086032 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-58d5-account-create-update-4zxmv"]
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.146493 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564046-mgd5h"]
Mar 18 14:06:00 crc kubenswrapper[4921]: E0318 14:06:00.147332 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e017f54c-f94e-4793-9818-3c6feb26da84" containerName="oc"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.147355 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e017f54c-f94e-4793-9818-3c6feb26da84" containerName="oc"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.147846 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e017f54c-f94e-4793-9818-3c6feb26da84" containerName="oc"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.148703 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.152308 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.152653 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.152822 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.157602 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-mgd5h"]
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.265268 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc8rz\" (UniqueName: \"kubernetes.io/projected/934596a1-2e4b-4fea-bf90-b32ae69dcd99-kube-api-access-wc8rz\") pod \"auto-csr-approver-29564046-mgd5h\" (UID: \"934596a1-2e4b-4fea-bf90-b32ae69dcd99\") " pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.366829 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc8rz\" (UniqueName: \"kubernetes.io/projected/934596a1-2e4b-4fea-bf90-b32ae69dcd99-kube-api-access-wc8rz\") pod \"auto-csr-approver-29564046-mgd5h\" (UID: \"934596a1-2e4b-4fea-bf90-b32ae69dcd99\") " pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.393951 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc8rz\" (UniqueName: \"kubernetes.io/projected/934596a1-2e4b-4fea-bf90-b32ae69dcd99-kube-api-access-wc8rz\") pod \"auto-csr-approver-29564046-mgd5h\" (UID: \"934596a1-2e4b-4fea-bf90-b32ae69dcd99\") " pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.483523 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:00 crc kubenswrapper[4921]: I0318 14:06:00.940699 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-mgd5h"]
Mar 18 14:06:01 crc kubenswrapper[4921]: I0318 14:06:01.227447 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a395ac-364e-4900-a705-366bf21d4cff" path="/var/lib/kubelet/pods/b1a395ac-364e-4900-a705-366bf21d4cff/volumes"
Mar 18 14:06:01 crc kubenswrapper[4921]: I0318 14:06:01.228526 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab18c2d-33f1-49e1-a4cf-f0200d43474d" path="/var/lib/kubelet/pods/dab18c2d-33f1-49e1-a4cf-f0200d43474d/volumes"
Mar 18 14:06:01 crc kubenswrapper[4921]: I0318 14:06:01.390746 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-mgd5h" event={"ID":"934596a1-2e4b-4fea-bf90-b32ae69dcd99","Type":"ContainerStarted","Data":"3121df5c93aa2fa19b43188d2cd9e2e4cc51fd4e57be4466e46048a375ce7711"}
Mar 18 14:06:03 crc kubenswrapper[4921]: I0318 14:06:03.411968 4921 generic.go:334] "Generic (PLEG): container finished" podID="934596a1-2e4b-4fea-bf90-b32ae69dcd99" containerID="77b97581e26a20ee59cc24560c806c6e0944cc0567e9467c3068131f2980bd82" exitCode=0
Mar 18 14:06:03 crc kubenswrapper[4921]: I0318 14:06:03.412022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-mgd5h" event={"ID":"934596a1-2e4b-4fea-bf90-b32ae69dcd99","Type":"ContainerDied","Data":"77b97581e26a20ee59cc24560c806c6e0944cc0567e9467c3068131f2980bd82"}
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.141270 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.289224 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc8rz\" (UniqueName: \"kubernetes.io/projected/934596a1-2e4b-4fea-bf90-b32ae69dcd99-kube-api-access-wc8rz\") pod \"934596a1-2e4b-4fea-bf90-b32ae69dcd99\" (UID: \"934596a1-2e4b-4fea-bf90-b32ae69dcd99\") "
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.294475 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934596a1-2e4b-4fea-bf90-b32ae69dcd99-kube-api-access-wc8rz" (OuterVolumeSpecName: "kube-api-access-wc8rz") pod "934596a1-2e4b-4fea-bf90-b32ae69dcd99" (UID: "934596a1-2e4b-4fea-bf90-b32ae69dcd99"). InnerVolumeSpecName "kube-api-access-wc8rz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.391968 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc8rz\" (UniqueName: \"kubernetes.io/projected/934596a1-2e4b-4fea-bf90-b32ae69dcd99-kube-api-access-wc8rz\") on node \"crc\" DevicePath \"\""
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.433303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564046-mgd5h" event={"ID":"934596a1-2e4b-4fea-bf90-b32ae69dcd99","Type":"ContainerDied","Data":"3121df5c93aa2fa19b43188d2cd9e2e4cc51fd4e57be4466e46048a375ce7711"}
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.433529 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3121df5c93aa2fa19b43188d2cd9e2e4cc51fd4e57be4466e46048a375ce7711"
Mar 18 14:06:05 crc kubenswrapper[4921]: I0318 14:06:05.433354 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564046-mgd5h"
Mar 18 14:06:06 crc kubenswrapper[4921]: I0318 14:06:06.204681 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-j2hp4"]
Mar 18 14:06:06 crc kubenswrapper[4921]: I0318 14:06:06.215359 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564040-j2hp4"]
Mar 18 14:06:07 crc kubenswrapper[4921]: I0318 14:06:07.225491 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1974abf9-efc3-4a7a-aa1c-6e604692767f" path="/var/lib/kubelet/pods/1974abf9-efc3-4a7a-aa1c-6e604692767f/volumes"
Mar 18 14:06:10 crc kubenswrapper[4921]: I0318 14:06:10.035967 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-w92g6"]
Mar 18 14:06:10 crc kubenswrapper[4921]: I0318 14:06:10.046711 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-w92g6"]
Mar 18 14:06:11 crc kubenswrapper[4921]: I0318 14:06:11.231339 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef32792-2603-4b63-8bf7-3413655270db" path="/var/lib/kubelet/pods/3ef32792-2603-4b63-8bf7-3413655270db/volumes"
Mar 18 14:06:15 crc kubenswrapper[4921]: I0318 14:06:15.286189 4921 scope.go:117] "RemoveContainer" containerID="27e014aae387f6c4cd7700096ce984c894a100716b0d67dde6a719c528f104ab"
Mar 18 14:06:15 crc kubenswrapper[4921]: I0318 14:06:15.318553 4921 scope.go:117] "RemoveContainer" containerID="d124030762d18692cc74109f23efc76d48d60419a6c2e9bd6a293510f755b775"
Mar 18 14:06:15 crc kubenswrapper[4921]: I0318 14:06:15.378815 4921 scope.go:117] "RemoveContainer" containerID="67a64396e681353d6daabf1d87f24147c70dcb7dc467543dc09c98729fb0c219"
Mar 18 14:06:15 crc kubenswrapper[4921]: I0318 14:06:15.406221 4921 scope.go:117] "RemoveContainer" containerID="2f765237217e582f834742b79cbfed112a1d81853c0d2b3a85d5567135f55fcc"
Mar 18 14:06:17 crc kubenswrapper[4921]: I0318 14:06:17.080750 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 14:06:17 crc kubenswrapper[4921]: I0318 14:06:17.081138 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.441073 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-scpsg"]
Mar 18 14:06:24 crc kubenswrapper[4921]: E0318 14:06:24.442369 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934596a1-2e4b-4fea-bf90-b32ae69dcd99" containerName="oc"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.442389 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="934596a1-2e4b-4fea-bf90-b32ae69dcd99" containerName="oc"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.442677 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="934596a1-2e4b-4fea-bf90-b32ae69dcd99" containerName="oc"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.445447 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.461424 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scpsg"]
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.516565 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-catalog-content\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.516630 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-utilities\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.516813 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c87v7\" (UniqueName: \"kubernetes.io/projected/772b74c1-ee4c-46d1-94ca-640a1cff36ff-kube-api-access-c87v7\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.619208 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c87v7\" (UniqueName: \"kubernetes.io/projected/772b74c1-ee4c-46d1-94ca-640a1cff36ff-kube-api-access-c87v7\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.619485 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-catalog-content\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.619548 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-utilities\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.620129 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-catalog-content\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.620361 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-utilities\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.640137 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c87v7\" (UniqueName: \"kubernetes.io/projected/772b74c1-ee4c-46d1-94ca-640a1cff36ff-kube-api-access-c87v7\") pod \"certified-operators-scpsg\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") " pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:24 crc kubenswrapper[4921]: I0318 14:06:24.804287 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:25 crc kubenswrapper[4921]: I0318 14:06:25.408677 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scpsg"]
Mar 18 14:06:25 crc kubenswrapper[4921]: I0318 14:06:25.649476 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerStarted","Data":"7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939"}
Mar 18 14:06:25 crc kubenswrapper[4921]: I0318 14:06:25.649530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerStarted","Data":"32e27e572c609afeeaffb3ee962b0e9a5abd7104480df1278c75e1a4df59d1dc"}
Mar 18 14:06:26 crc kubenswrapper[4921]: I0318 14:06:26.662344 4921 generic.go:334] "Generic (PLEG): container finished" podID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerID="7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939" exitCode=0
Mar 18 14:06:26 crc kubenswrapper[4921]: I0318 14:06:26.662810 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerDied","Data":"7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939"}
Mar 18 14:06:27 crc kubenswrapper[4921]: I0318 14:06:27.673526 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerStarted","Data":"cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de"}
Mar 18 14:06:30 crc kubenswrapper[4921]: I0318 14:06:30.701884 4921 generic.go:334] "Generic (PLEG): container finished" podID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerID="cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de" exitCode=0
Mar 18 14:06:30 crc kubenswrapper[4921]: I0318 14:06:30.701957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerDied","Data":"cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de"}
Mar 18 14:06:31 crc kubenswrapper[4921]: I0318 14:06:31.714422 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerStarted","Data":"979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0"}
Mar 18 14:06:31 crc kubenswrapper[4921]: I0318 14:06:31.738301 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-scpsg" podStartSLOduration=3.104436568 podStartE2EDuration="7.738278415s" podCreationTimestamp="2026-03-18 14:06:24 +0000 UTC" firstStartedPulling="2026-03-18 14:06:26.66774335 +0000 UTC m=+7006.217663999" lastFinishedPulling="2026-03-18 14:06:31.301585207 +0000 UTC m=+7010.851505846" observedRunningTime="2026-03-18 14:06:31.737912525 +0000 UTC m=+7011.287833164" watchObservedRunningTime="2026-03-18 14:06:31.738278415 +0000 UTC m=+7011.288199054"
Mar 18 14:06:32 crc kubenswrapper[4921]: I0318 14:06:32.041310 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-efa8-account-create-update-5pr48"]
Mar 18 14:06:32 crc kubenswrapper[4921]: I0318 14:06:32.050962 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-frttr"]
Mar 18 14:06:32 crc kubenswrapper[4921]: I0318 14:06:32.061884 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-efa8-account-create-update-5pr48"]
Mar 18 14:06:32 crc kubenswrapper[4921]: I0318 14:06:32.073105 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-frttr"]
Mar 18 14:06:33 crc kubenswrapper[4921]: I0318 14:06:33.246212 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c55e47-621b-45db-8447-543e81a28036" path="/var/lib/kubelet/pods/69c55e47-621b-45db-8447-543e81a28036/volumes"
Mar 18 14:06:33 crc kubenswrapper[4921]: I0318 14:06:33.251057 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fc5394-7a82-43c6-88fd-ccec001cf9a5" path="/var/lib/kubelet/pods/b0fc5394-7a82-43c6-88fd-ccec001cf9a5/volumes"
Mar 18 14:06:34 crc kubenswrapper[4921]: I0318 14:06:34.805209 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:34 crc kubenswrapper[4921]: I0318 14:06:34.805542 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:34 crc kubenswrapper[4921]: I0318 14:06:34.853851 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:44 crc kubenswrapper[4921]: I0318 14:06:44.052607 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-n7ftq"]
Mar 18 14:06:44 crc kubenswrapper[4921]: I0318 14:06:44.064408 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-n7ftq"]
Mar 18 14:06:44 crc kubenswrapper[4921]: I0318 14:06:44.877276 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:44 crc kubenswrapper[4921]: I0318 14:06:44.936603 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scpsg"]
Mar 18 14:06:45 crc kubenswrapper[4921]: I0318 14:06:45.227576 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0baadf0-64a5-45b5-9e26-491b274ea3d4" path="/var/lib/kubelet/pods/b0baadf0-64a5-45b5-9e26-491b274ea3d4/volumes"
Mar 18 14:06:45 crc kubenswrapper[4921]: I0318 14:06:45.848737 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-scpsg" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="registry-server" containerID="cri-o://979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0" gracePeriod=2
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.400399 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scpsg"
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.501728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-catalog-content\") pod \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") "
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.501988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c87v7\" (UniqueName: \"kubernetes.io/projected/772b74c1-ee4c-46d1-94ca-640a1cff36ff-kube-api-access-c87v7\") pod \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") "
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.502162 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-utilities\") pod \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\" (UID: \"772b74c1-ee4c-46d1-94ca-640a1cff36ff\") "
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.502946 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-utilities" (OuterVolumeSpecName: "utilities") pod "772b74c1-ee4c-46d1-94ca-640a1cff36ff" (UID: "772b74c1-ee4c-46d1-94ca-640a1cff36ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.521592 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772b74c1-ee4c-46d1-94ca-640a1cff36ff-kube-api-access-c87v7" (OuterVolumeSpecName: "kube-api-access-c87v7") pod "772b74c1-ee4c-46d1-94ca-640a1cff36ff" (UID: "772b74c1-ee4c-46d1-94ca-640a1cff36ff"). InnerVolumeSpecName "kube-api-access-c87v7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.559929 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "772b74c1-ee4c-46d1-94ca-640a1cff36ff" (UID: "772b74c1-ee4c-46d1-94ca-640a1cff36ff"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.604719 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.604761 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/772b74c1-ee4c-46d1-94ca-640a1cff36ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.604774 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c87v7\" (UniqueName: \"kubernetes.io/projected/772b74c1-ee4c-46d1-94ca-640a1cff36ff-kube-api-access-c87v7\") on node \"crc\" DevicePath \"\"" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.864057 4921 generic.go:334] "Generic (PLEG): container finished" podID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerID="979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0" exitCode=0 Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.864141 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerDied","Data":"979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0"} Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.864545 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scpsg" event={"ID":"772b74c1-ee4c-46d1-94ca-640a1cff36ff","Type":"ContainerDied","Data":"32e27e572c609afeeaffb3ee962b0e9a5abd7104480df1278c75e1a4df59d1dc"} Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.864569 4921 scope.go:117] "RemoveContainer" containerID="979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 
14:06:46.864175 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scpsg" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.903769 4921 scope.go:117] "RemoveContainer" containerID="cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de" Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.913054 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scpsg"] Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.922396 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-scpsg"] Mar 18 14:06:46 crc kubenswrapper[4921]: I0318 14:06:46.938977 4921 scope.go:117] "RemoveContainer" containerID="7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.005263 4921 scope.go:117] "RemoveContainer" containerID="979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0" Mar 18 14:06:47 crc kubenswrapper[4921]: E0318 14:06:47.005796 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0\": container with ID starting with 979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0 not found: ID does not exist" containerID="979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.005841 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0"} err="failed to get container status \"979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0\": rpc error: code = NotFound desc = could not find container \"979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0\": container with ID starting with 
979cac9e523b269531a47d9b44a1196b9180730dd8be647eeb22950824fae6a0 not found: ID does not exist" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.005867 4921 scope.go:117] "RemoveContainer" containerID="cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de" Mar 18 14:06:47 crc kubenswrapper[4921]: E0318 14:06:47.006340 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de\": container with ID starting with cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de not found: ID does not exist" containerID="cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.006399 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de"} err="failed to get container status \"cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de\": rpc error: code = NotFound desc = could not find container \"cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de\": container with ID starting with cf02b904bf941ddaca0cf6a175240ea623a8cb5d7917a17afab70fb3f35b99de not found: ID does not exist" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.006432 4921 scope.go:117] "RemoveContainer" containerID="7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939" Mar 18 14:06:47 crc kubenswrapper[4921]: E0318 14:06:47.007313 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939\": container with ID starting with 7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939 not found: ID does not exist" containerID="7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939" Mar 18 14:06:47 crc 
kubenswrapper[4921]: I0318 14:06:47.007353 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939"} err="failed to get container status \"7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939\": rpc error: code = NotFound desc = could not find container \"7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939\": container with ID starting with 7aac4a8bc088f291a3a0953cd103fab57a4e8cfa3fc437343510278aa34a3939 not found: ID does not exist" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.080870 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.080925 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:06:47 crc kubenswrapper[4921]: I0318 14:06:47.220681 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" path="/var/lib/kubelet/pods/772b74c1-ee4c-46d1-94ca-640a1cff36ff/volumes" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.383522 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9rlzg"] Mar 18 14:06:53 crc kubenswrapper[4921]: E0318 14:06:53.384320 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="registry-server" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 
14:06:53.384332 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="registry-server" Mar 18 14:06:53 crc kubenswrapper[4921]: E0318 14:06:53.384360 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="extract-content" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.384366 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="extract-content" Mar 18 14:06:53 crc kubenswrapper[4921]: E0318 14:06:53.384395 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="extract-utilities" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.384403 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="extract-utilities" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.384590 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="772b74c1-ee4c-46d1-94ca-640a1cff36ff" containerName="registry-server" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.386074 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.395208 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9rlzg"] Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.461473 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-catalog-content\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.461536 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-utilities\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.461671 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqk4\" (UniqueName: \"kubernetes.io/projected/6220d936-c39b-4b26-a500-b9190536971b-kube-api-access-xcqk4\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.563340 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqk4\" (UniqueName: \"kubernetes.io/projected/6220d936-c39b-4b26-a500-b9190536971b-kube-api-access-xcqk4\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.563475 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-catalog-content\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.563506 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-utilities\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.563994 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-catalog-content\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.564015 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-utilities\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.586906 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqk4\" (UniqueName: \"kubernetes.io/projected/6220d936-c39b-4b26-a500-b9190536971b-kube-api-access-xcqk4\") pod \"community-operators-9rlzg\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:53 crc kubenswrapper[4921]: I0318 14:06:53.727477 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:06:54 crc kubenswrapper[4921]: I0318 14:06:54.262714 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9rlzg"] Mar 18 14:06:54 crc kubenswrapper[4921]: I0318 14:06:54.956785 4921 generic.go:334] "Generic (PLEG): container finished" podID="6220d936-c39b-4b26-a500-b9190536971b" containerID="973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3" exitCode=0 Mar 18 14:06:54 crc kubenswrapper[4921]: I0318 14:06:54.956829 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rlzg" event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerDied","Data":"973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3"} Mar 18 14:06:54 crc kubenswrapper[4921]: I0318 14:06:54.957185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rlzg" event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerStarted","Data":"4a8ddcbbbdbcfa9ecc367aacc3003ece4691cc103e5c31434537774ee19c32b8"} Mar 18 14:06:56 crc kubenswrapper[4921]: I0318 14:06:56.980530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rlzg" event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerStarted","Data":"83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca"} Mar 18 14:06:59 crc kubenswrapper[4921]: I0318 14:06:59.003193 4921 generic.go:334] "Generic (PLEG): container finished" podID="6220d936-c39b-4b26-a500-b9190536971b" containerID="83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca" exitCode=0 Mar 18 14:06:59 crc kubenswrapper[4921]: I0318 14:06:59.003257 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rlzg" 
event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerDied","Data":"83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca"} Mar 18 14:07:00 crc kubenswrapper[4921]: I0318 14:07:00.015435 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rlzg" event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerStarted","Data":"f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4"} Mar 18 14:07:00 crc kubenswrapper[4921]: I0318 14:07:00.044706 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9rlzg" podStartSLOduration=2.510314413 podStartE2EDuration="7.04467924s" podCreationTimestamp="2026-03-18 14:06:53 +0000 UTC" firstStartedPulling="2026-03-18 14:06:54.959050932 +0000 UTC m=+7034.508971571" lastFinishedPulling="2026-03-18 14:06:59.493415759 +0000 UTC m=+7039.043336398" observedRunningTime="2026-03-18 14:07:00.041397826 +0000 UTC m=+7039.591318485" watchObservedRunningTime="2026-03-18 14:07:00.04467924 +0000 UTC m=+7039.594599879" Mar 18 14:07:03 crc kubenswrapper[4921]: I0318 14:07:03.727853 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:07:03 crc kubenswrapper[4921]: I0318 14:07:03.729469 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:07:04 crc kubenswrapper[4921]: I0318 14:07:04.775000 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9rlzg" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="registry-server" probeResult="failure" output=< Mar 18 14:07:04 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 14:07:04 crc kubenswrapper[4921]: > Mar 18 14:07:13 crc kubenswrapper[4921]: I0318 14:07:13.780660 4921 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:07:13 crc kubenswrapper[4921]: I0318 14:07:13.835834 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:07:14 crc kubenswrapper[4921]: I0318 14:07:14.021784 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9rlzg"] Mar 18 14:07:15 crc kubenswrapper[4921]: I0318 14:07:15.573510 4921 scope.go:117] "RemoveContainer" containerID="d89abc3c68ec704d1323a0ab3498afe2d3b8133247a30f47efe56be42b09929a" Mar 18 14:07:15 crc kubenswrapper[4921]: I0318 14:07:15.613473 4921 scope.go:117] "RemoveContainer" containerID="6d1593444d1d05e1625ed5656a5748ffb87504be92122a82bb356102bae4e77d" Mar 18 14:07:15 crc kubenswrapper[4921]: I0318 14:07:15.665972 4921 scope.go:117] "RemoveContainer" containerID="b4b72efa4d8b79225c60d5cc754b68fee62e122a73f2a61b4fc7544c3352f66e" Mar 18 14:07:15 crc kubenswrapper[4921]: I0318 14:07:15.772308 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9rlzg" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="registry-server" containerID="cri-o://f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4" gracePeriod=2 Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.249916 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.312763 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcqk4\" (UniqueName: \"kubernetes.io/projected/6220d936-c39b-4b26-a500-b9190536971b-kube-api-access-xcqk4\") pod \"6220d936-c39b-4b26-a500-b9190536971b\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.312963 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-catalog-content\") pod \"6220d936-c39b-4b26-a500-b9190536971b\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.313321 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-utilities\") pod \"6220d936-c39b-4b26-a500-b9190536971b\" (UID: \"6220d936-c39b-4b26-a500-b9190536971b\") " Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.317623 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-utilities" (OuterVolumeSpecName: "utilities") pod "6220d936-c39b-4b26-a500-b9190536971b" (UID: "6220d936-c39b-4b26-a500-b9190536971b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.328478 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6220d936-c39b-4b26-a500-b9190536971b-kube-api-access-xcqk4" (OuterVolumeSpecName: "kube-api-access-xcqk4") pod "6220d936-c39b-4b26-a500-b9190536971b" (UID: "6220d936-c39b-4b26-a500-b9190536971b"). InnerVolumeSpecName "kube-api-access-xcqk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.377031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6220d936-c39b-4b26-a500-b9190536971b" (UID: "6220d936-c39b-4b26-a500-b9190536971b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.416605 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.416643 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcqk4\" (UniqueName: \"kubernetes.io/projected/6220d936-c39b-4b26-a500-b9190536971b-kube-api-access-xcqk4\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.416655 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6220d936-c39b-4b26-a500-b9190536971b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.782493 4921 generic.go:334] "Generic (PLEG): container finished" podID="6220d936-c39b-4b26-a500-b9190536971b" containerID="f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4" exitCode=0 Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.782531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9rlzg" event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerDied","Data":"f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4"} Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.782554 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9rlzg" event={"ID":"6220d936-c39b-4b26-a500-b9190536971b","Type":"ContainerDied","Data":"4a8ddcbbbdbcfa9ecc367aacc3003ece4691cc103e5c31434537774ee19c32b8"} Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.782573 4921 scope.go:117] "RemoveContainer" containerID="f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.782695 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9rlzg" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.812531 4921 scope.go:117] "RemoveContainer" containerID="83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.826275 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9rlzg"] Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.835136 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9rlzg"] Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.837428 4921 scope.go:117] "RemoveContainer" containerID="973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.893678 4921 scope.go:117] "RemoveContainer" containerID="f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4" Mar 18 14:07:16 crc kubenswrapper[4921]: E0318 14:07:16.894266 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4\": container with ID starting with f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4 not found: ID does not exist" containerID="f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 
14:07:16.894300 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4"} err="failed to get container status \"f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4\": rpc error: code = NotFound desc = could not find container \"f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4\": container with ID starting with f17257a5bae417fb8be7398bbc3419606cf29102caa1e6d2b8eea333f6d0dcd4 not found: ID does not exist" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.894319 4921 scope.go:117] "RemoveContainer" containerID="83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca" Mar 18 14:07:16 crc kubenswrapper[4921]: E0318 14:07:16.894718 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca\": container with ID starting with 83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca not found: ID does not exist" containerID="83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.894740 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca"} err="failed to get container status \"83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca\": rpc error: code = NotFound desc = could not find container \"83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca\": container with ID starting with 83fdc185e8b8ca98cf99931bcc5f278c1cc4e985fda18143e71d874d57ca02ca not found: ID does not exist" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.894756 4921 scope.go:117] "RemoveContainer" containerID="973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3" Mar 18 14:07:16 crc 
kubenswrapper[4921]: E0318 14:07:16.895010 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3\": container with ID starting with 973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3 not found: ID does not exist" containerID="973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3" Mar 18 14:07:16 crc kubenswrapper[4921]: I0318 14:07:16.895031 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3"} err="failed to get container status \"973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3\": rpc error: code = NotFound desc = could not find container \"973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3\": container with ID starting with 973e4a9c850b6f3f9efc3d1857b3a6d5434b9627fe8f3174cd243a655d2f34a3 not found: ID does not exist" Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.081160 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.081231 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.081313 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 
14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.082209 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.082269 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" gracePeriod=600 Mar 18 14:07:17 crc kubenswrapper[4921]: E0318 14:07:17.208672 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.228291 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6220d936-c39b-4b26-a500-b9190536971b" path="/var/lib/kubelet/pods/6220d936-c39b-4b26-a500-b9190536971b/volumes" Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.795250 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" exitCode=0 Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.795294 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354"} Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.795355 4921 scope.go:117] "RemoveContainer" containerID="d2cd7ff9067aa80b963b291fa9725c926eb6ca5e129d5aba088028134092770a" Mar 18 14:07:17 crc kubenswrapper[4921]: I0318 14:07:17.796533 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:07:17 crc kubenswrapper[4921]: E0318 14:07:17.797158 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:07:29 crc kubenswrapper[4921]: I0318 14:07:29.210093 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:07:29 crc kubenswrapper[4921]: E0318 14:07:29.211066 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:07:40 crc kubenswrapper[4921]: I0318 14:07:40.209184 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:07:40 crc kubenswrapper[4921]: E0318 14:07:40.209919 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:07:51 crc kubenswrapper[4921]: I0318 14:07:51.218858 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:07:51 crc kubenswrapper[4921]: E0318 14:07:51.221051 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.153221 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564048-hrw97"] Mar 18 14:08:00 crc kubenswrapper[4921]: E0318 14:08:00.154272 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="extract-utilities" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.154290 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="extract-utilities" Mar 18 14:08:00 crc kubenswrapper[4921]: E0318 14:08:00.154323 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="registry-server" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.154331 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="registry-server" Mar 18 14:08:00 crc kubenswrapper[4921]: E0318 14:08:00.154378 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="extract-content" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.154387 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="extract-content" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.154667 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6220d936-c39b-4b26-a500-b9190536971b" containerName="registry-server" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.155545 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.157802 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.158045 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.158245 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.175693 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-hrw97"] Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.236382 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2sn\" (UniqueName: \"kubernetes.io/projected/b8b71ff6-45a1-4676-a640-daa488a2985b-kube-api-access-6m2sn\") pod \"auto-csr-approver-29564048-hrw97\" (UID: \"b8b71ff6-45a1-4676-a640-daa488a2985b\") " 
pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.337561 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2sn\" (UniqueName: \"kubernetes.io/projected/b8b71ff6-45a1-4676-a640-daa488a2985b-kube-api-access-6m2sn\") pod \"auto-csr-approver-29564048-hrw97\" (UID: \"b8b71ff6-45a1-4676-a640-daa488a2985b\") " pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.354683 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2sn\" (UniqueName: \"kubernetes.io/projected/b8b71ff6-45a1-4676-a640-daa488a2985b-kube-api-access-6m2sn\") pod \"auto-csr-approver-29564048-hrw97\" (UID: \"b8b71ff6-45a1-4676-a640-daa488a2985b\") " pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:00 crc kubenswrapper[4921]: I0318 14:08:00.502353 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:01 crc kubenswrapper[4921]: I0318 14:08:01.008769 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-hrw97"] Mar 18 14:08:01 crc kubenswrapper[4921]: I0318 14:08:01.022501 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:08:01 crc kubenswrapper[4921]: I0318 14:08:01.257240 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-hrw97" event={"ID":"b8b71ff6-45a1-4676-a640-daa488a2985b","Type":"ContainerStarted","Data":"f657c8d5c5047bacfe34b1baab8f8056dc051014e85ffb28d7dda2e73d7371b9"} Mar 18 14:08:03 crc kubenswrapper[4921]: I0318 14:08:03.281069 4921 generic.go:334] "Generic (PLEG): container finished" podID="b8b71ff6-45a1-4676-a640-daa488a2985b" containerID="49c50ab2154ad5d0ff223eb07adfd20bb84e9ac1aaeb9c24eb566f61e126a1d0" exitCode=0 Mar 
18 14:08:03 crc kubenswrapper[4921]: I0318 14:08:03.281178 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-hrw97" event={"ID":"b8b71ff6-45a1-4676-a640-daa488a2985b","Type":"ContainerDied","Data":"49c50ab2154ad5d0ff223eb07adfd20bb84e9ac1aaeb9c24eb566f61e126a1d0"} Mar 18 14:08:04 crc kubenswrapper[4921]: I0318 14:08:04.665243 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:04 crc kubenswrapper[4921]: I0318 14:08:04.830763 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m2sn\" (UniqueName: \"kubernetes.io/projected/b8b71ff6-45a1-4676-a640-daa488a2985b-kube-api-access-6m2sn\") pod \"b8b71ff6-45a1-4676-a640-daa488a2985b\" (UID: \"b8b71ff6-45a1-4676-a640-daa488a2985b\") " Mar 18 14:08:04 crc kubenswrapper[4921]: I0318 14:08:04.837529 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b71ff6-45a1-4676-a640-daa488a2985b-kube-api-access-6m2sn" (OuterVolumeSpecName: "kube-api-access-6m2sn") pod "b8b71ff6-45a1-4676-a640-daa488a2985b" (UID: "b8b71ff6-45a1-4676-a640-daa488a2985b"). InnerVolumeSpecName "kube-api-access-6m2sn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:08:04 crc kubenswrapper[4921]: I0318 14:08:04.934045 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m2sn\" (UniqueName: \"kubernetes.io/projected/b8b71ff6-45a1-4676-a640-daa488a2985b-kube-api-access-6m2sn\") on node \"crc\" DevicePath \"\"" Mar 18 14:08:05 crc kubenswrapper[4921]: I0318 14:08:05.327627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564048-hrw97" event={"ID":"b8b71ff6-45a1-4676-a640-daa488a2985b","Type":"ContainerDied","Data":"f657c8d5c5047bacfe34b1baab8f8056dc051014e85ffb28d7dda2e73d7371b9"} Mar 18 14:08:05 crc kubenswrapper[4921]: I0318 14:08:05.328179 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f657c8d5c5047bacfe34b1baab8f8056dc051014e85ffb28d7dda2e73d7371b9" Mar 18 14:08:05 crc kubenswrapper[4921]: I0318 14:08:05.328268 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564048-hrw97" Mar 18 14:08:05 crc kubenswrapper[4921]: I0318 14:08:05.742291 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-qgq8b"] Mar 18 14:08:05 crc kubenswrapper[4921]: I0318 14:08:05.750696 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564042-qgq8b"] Mar 18 14:08:06 crc kubenswrapper[4921]: I0318 14:08:06.208584 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:08:06 crc kubenswrapper[4921]: E0318 14:08:06.209083 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:08:07 crc kubenswrapper[4921]: I0318 14:08:07.221249 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ddbe27-5129-44a0-8c88-443e36afa64d" path="/var/lib/kubelet/pods/04ddbe27-5129-44a0-8c88-443e36afa64d/volumes" Mar 18 14:08:15 crc kubenswrapper[4921]: I0318 14:08:15.802929 4921 scope.go:117] "RemoveContainer" containerID="1a4f19bcf99af9840423700d66ae4051dfbf12372331cdbd4d711296feaef27d" Mar 18 14:08:21 crc kubenswrapper[4921]: I0318 14:08:21.217831 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:08:21 crc kubenswrapper[4921]: E0318 14:08:21.219647 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:08:35 crc kubenswrapper[4921]: I0318 14:08:35.209629 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:08:35 crc kubenswrapper[4921]: E0318 14:08:35.210496 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:08:47 crc kubenswrapper[4921]: I0318 14:08:47.946877 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-n562b"] Mar 18 14:08:47 crc kubenswrapper[4921]: E0318 14:08:47.948494 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b71ff6-45a1-4676-a640-daa488a2985b" containerName="oc" Mar 18 14:08:47 crc kubenswrapper[4921]: I0318 14:08:47.948508 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b71ff6-45a1-4676-a640-daa488a2985b" containerName="oc" Mar 18 14:08:47 crc kubenswrapper[4921]: I0318 14:08:47.948732 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b71ff6-45a1-4676-a640-daa488a2985b" containerName="oc" Mar 18 14:08:47 crc kubenswrapper[4921]: I0318 14:08:47.950194 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:47 crc kubenswrapper[4921]: I0318 14:08:47.972988 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n562b"] Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.114192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-catalog-content\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.114277 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbnl\" (UniqueName: \"kubernetes.io/projected/77f0cdb8-2225-47ad-a1e5-1aa99af08749-kube-api-access-lvbnl\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.114530 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-utilities\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.217241 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-catalog-content\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.217320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbnl\" (UniqueName: \"kubernetes.io/projected/77f0cdb8-2225-47ad-a1e5-1aa99af08749-kube-api-access-lvbnl\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.217548 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-utilities\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.218549 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-utilities\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.218727 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-catalog-content\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.254055 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvbnl\" (UniqueName: \"kubernetes.io/projected/77f0cdb8-2225-47ad-a1e5-1aa99af08749-kube-api-access-lvbnl\") pod \"redhat-operators-n562b\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.273486 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.765091 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n562b"] Mar 18 14:08:48 crc kubenswrapper[4921]: I0318 14:08:48.787428 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerStarted","Data":"f859a16e47f9cbb06866edf3e555573375b600a739dc87dff4c1be70a722ca0e"} Mar 18 14:08:49 crc kubenswrapper[4921]: I0318 14:08:49.209491 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:08:49 crc kubenswrapper[4921]: E0318 14:08:49.210073 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:08:49 crc 
kubenswrapper[4921]: I0318 14:08:49.800075 4921 generic.go:334] "Generic (PLEG): container finished" podID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerID="4a4ea69f350fb27556ce2715e077b725dd5c8d259c8eecfb0acea034c2503125" exitCode=0 Mar 18 14:08:49 crc kubenswrapper[4921]: I0318 14:08:49.800147 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerDied","Data":"4a4ea69f350fb27556ce2715e077b725dd5c8d259c8eecfb0acea034c2503125"} Mar 18 14:08:50 crc kubenswrapper[4921]: I0318 14:08:50.818977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerStarted","Data":"2ebd4080049ed9e9e7f61a3090c81f034d68a90ef5bb5601b3a59e35e08e238b"} Mar 18 14:08:55 crc kubenswrapper[4921]: I0318 14:08:55.875015 4921 generic.go:334] "Generic (PLEG): container finished" podID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerID="2ebd4080049ed9e9e7f61a3090c81f034d68a90ef5bb5601b3a59e35e08e238b" exitCode=0 Mar 18 14:08:55 crc kubenswrapper[4921]: I0318 14:08:55.875126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerDied","Data":"2ebd4080049ed9e9e7f61a3090c81f034d68a90ef5bb5601b3a59e35e08e238b"} Mar 18 14:08:56 crc kubenswrapper[4921]: I0318 14:08:56.899975 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerStarted","Data":"43adb49e942bc694b9e20c7f1e4d1eebd21c750948cf7c22e827b492b7b1a4c4"} Mar 18 14:08:56 crc kubenswrapper[4921]: I0318 14:08:56.938646 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n562b" podStartSLOduration=3.439422087 
podStartE2EDuration="9.938617121s" podCreationTimestamp="2026-03-18 14:08:47 +0000 UTC" firstStartedPulling="2026-03-18 14:08:49.803254712 +0000 UTC m=+7149.353175351" lastFinishedPulling="2026-03-18 14:08:56.302449746 +0000 UTC m=+7155.852370385" observedRunningTime="2026-03-18 14:08:56.922828553 +0000 UTC m=+7156.472749192" watchObservedRunningTime="2026-03-18 14:08:56.938617121 +0000 UTC m=+7156.488537790" Mar 18 14:08:58 crc kubenswrapper[4921]: I0318 14:08:58.274435 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:58 crc kubenswrapper[4921]: I0318 14:08:58.274754 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:08:59 crc kubenswrapper[4921]: I0318 14:08:59.322823 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n562b" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="registry-server" probeResult="failure" output=< Mar 18 14:08:59 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 14:08:59 crc kubenswrapper[4921]: > Mar 18 14:09:02 crc kubenswrapper[4921]: I0318 14:09:02.211237 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:09:02 crc kubenswrapper[4921]: E0318 14:09:02.211985 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:09:09 crc kubenswrapper[4921]: I0318 14:09:09.318201 4921 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-n562b" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="registry-server" probeResult="failure" output=< Mar 18 14:09:09 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 14:09:09 crc kubenswrapper[4921]: > Mar 18 14:09:14 crc kubenswrapper[4921]: I0318 14:09:14.209875 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:09:14 crc kubenswrapper[4921]: E0318 14:09:14.210652 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:09:18 crc kubenswrapper[4921]: I0318 14:09:18.321606 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:09:18 crc kubenswrapper[4921]: I0318 14:09:18.373411 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:09:19 crc kubenswrapper[4921]: I0318 14:09:19.156178 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n562b"] Mar 18 14:09:20 crc kubenswrapper[4921]: I0318 14:09:20.146612 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n562b" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="registry-server" containerID="cri-o://43adb49e942bc694b9e20c7f1e4d1eebd21c750948cf7c22e827b492b7b1a4c4" gracePeriod=2 Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.162010 4921 generic.go:334] "Generic (PLEG): container 
finished" podID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerID="43adb49e942bc694b9e20c7f1e4d1eebd21c750948cf7c22e827b492b7b1a4c4" exitCode=0 Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.162136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerDied","Data":"43adb49e942bc694b9e20c7f1e4d1eebd21c750948cf7c22e827b492b7b1a4c4"} Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.327323 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.357153 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvbnl\" (UniqueName: \"kubernetes.io/projected/77f0cdb8-2225-47ad-a1e5-1aa99af08749-kube-api-access-lvbnl\") pod \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.357222 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-utilities\") pod \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.357365 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-catalog-content\") pod \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\" (UID: \"77f0cdb8-2225-47ad-a1e5-1aa99af08749\") " Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.358244 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-utilities" (OuterVolumeSpecName: "utilities") pod 
"77f0cdb8-2225-47ad-a1e5-1aa99af08749" (UID: "77f0cdb8-2225-47ad-a1e5-1aa99af08749"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.378155 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f0cdb8-2225-47ad-a1e5-1aa99af08749-kube-api-access-lvbnl" (OuterVolumeSpecName: "kube-api-access-lvbnl") pod "77f0cdb8-2225-47ad-a1e5-1aa99af08749" (UID: "77f0cdb8-2225-47ad-a1e5-1aa99af08749"). InnerVolumeSpecName "kube-api-access-lvbnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.460745 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvbnl\" (UniqueName: \"kubernetes.io/projected/77f0cdb8-2225-47ad-a1e5-1aa99af08749-kube-api-access-lvbnl\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.460791 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.618260 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77f0cdb8-2225-47ad-a1e5-1aa99af08749" (UID: "77f0cdb8-2225-47ad-a1e5-1aa99af08749"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:09:21 crc kubenswrapper[4921]: I0318 14:09:21.665062 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77f0cdb8-2225-47ad-a1e5-1aa99af08749-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.175851 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n562b" event={"ID":"77f0cdb8-2225-47ad-a1e5-1aa99af08749","Type":"ContainerDied","Data":"f859a16e47f9cbb06866edf3e555573375b600a739dc87dff4c1be70a722ca0e"} Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.175904 4921 scope.go:117] "RemoveContainer" containerID="43adb49e942bc694b9e20c7f1e4d1eebd21c750948cf7c22e827b492b7b1a4c4" Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.175952 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n562b" Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.212809 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n562b"] Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.213676 4921 scope.go:117] "RemoveContainer" containerID="2ebd4080049ed9e9e7f61a3090c81f034d68a90ef5bb5601b3a59e35e08e238b" Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.222233 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n562b"] Mar 18 14:09:22 crc kubenswrapper[4921]: I0318 14:09:22.244767 4921 scope.go:117] "RemoveContainer" containerID="4a4ea69f350fb27556ce2715e077b725dd5c8d259c8eecfb0acea034c2503125" Mar 18 14:09:23 crc kubenswrapper[4921]: I0318 14:09:23.225722 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" path="/var/lib/kubelet/pods/77f0cdb8-2225-47ad-a1e5-1aa99af08749/volumes" Mar 18 14:09:27 crc 
kubenswrapper[4921]: I0318 14:09:27.210149 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:09:27 crc kubenswrapper[4921]: E0318 14:09:27.211145 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:09:33 crc kubenswrapper[4921]: I0318 14:09:33.296399 4921 generic.go:334] "Generic (PLEG): container finished" podID="68781f83-55b8-448f-83e1-1981ded6fdd9" containerID="2025e755077e42dc8fbbea1e657c64ca6083d8847db17ef474b7bc105c86f5dd" exitCode=0 Mar 18 14:09:33 crc kubenswrapper[4921]: I0318 14:09:33.296476 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" event={"ID":"68781f83-55b8-448f-83e1-1981ded6fdd9","Type":"ContainerDied","Data":"2025e755077e42dc8fbbea1e657c64ca6083d8847db17ef474b7bc105c86f5dd"} Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.797182 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.955957 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-inventory\") pod \"68781f83-55b8-448f-83e1-1981ded6fdd9\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.956392 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ssh-key-openstack-cell1\") pod \"68781f83-55b8-448f-83e1-1981ded6fdd9\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.956594 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ceph\") pod \"68781f83-55b8-448f-83e1-1981ded6fdd9\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.956647 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmkmb\" (UniqueName: \"kubernetes.io/projected/68781f83-55b8-448f-83e1-1981ded6fdd9-kube-api-access-bmkmb\") pod \"68781f83-55b8-448f-83e1-1981ded6fdd9\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.956702 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-tripleo-cleanup-combined-ca-bundle\") pod \"68781f83-55b8-448f-83e1-1981ded6fdd9\" (UID: \"68781f83-55b8-448f-83e1-1981ded6fdd9\") " Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.962796 4921 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ceph" (OuterVolumeSpecName: "ceph") pod "68781f83-55b8-448f-83e1-1981ded6fdd9" (UID: "68781f83-55b8-448f-83e1-1981ded6fdd9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.963157 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "68781f83-55b8-448f-83e1-1981ded6fdd9" (UID: "68781f83-55b8-448f-83e1-1981ded6fdd9"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:09:34 crc kubenswrapper[4921]: I0318 14:09:34.964031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68781f83-55b8-448f-83e1-1981ded6fdd9-kube-api-access-bmkmb" (OuterVolumeSpecName: "kube-api-access-bmkmb") pod "68781f83-55b8-448f-83e1-1981ded6fdd9" (UID: "68781f83-55b8-448f-83e1-1981ded6fdd9"). InnerVolumeSpecName "kube-api-access-bmkmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.001402 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "68781f83-55b8-448f-83e1-1981ded6fdd9" (UID: "68781f83-55b8-448f-83e1-1981ded6fdd9"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.006390 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-inventory" (OuterVolumeSpecName: "inventory") pod "68781f83-55b8-448f-83e1-1981ded6fdd9" (UID: "68781f83-55b8-448f-83e1-1981ded6fdd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.059269 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.059311 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmkmb\" (UniqueName: \"kubernetes.io/projected/68781f83-55b8-448f-83e1-1981ded6fdd9-kube-api-access-bmkmb\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.059328 4921 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.059342 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.059355 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68781f83-55b8-448f-83e1-1981ded6fdd9-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.321713 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" event={"ID":"68781f83-55b8-448f-83e1-1981ded6fdd9","Type":"ContainerDied","Data":"94bc2d3a7d02b346fd0617bfdbd8035edb8444402244b559123771b9fe743b58"} Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.321759 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94bc2d3a7d02b346fd0617bfdbd8035edb8444402244b559123771b9fe743b58" Mar 18 14:09:35 crc kubenswrapper[4921]: I0318 14:09:35.321788 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq" Mar 18 14:09:39 crc kubenswrapper[4921]: I0318 14:09:39.209912 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:09:39 crc kubenswrapper[4921]: E0318 14:09:39.210931 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.895372 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lk6pp"] Mar 18 14:09:44 crc kubenswrapper[4921]: E0318 14:09:44.896263 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="extract-utilities" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.896275 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="extract-utilities" Mar 18 14:09:44 crc kubenswrapper[4921]: E0318 14:09:44.896293 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="68781f83-55b8-448f-83e1-1981ded6fdd9" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.896300 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="68781f83-55b8-448f-83e1-1981ded6fdd9" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 14:09:44 crc kubenswrapper[4921]: E0318 14:09:44.896317 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="registry-server" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.896322 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="registry-server" Mar 18 14:09:44 crc kubenswrapper[4921]: E0318 14:09:44.896330 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="extract-content" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.896336 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="extract-content" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.896543 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="68781f83-55b8-448f-83e1-1981ded6fdd9" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.896557 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f0cdb8-2225-47ad-a1e5-1aa99af08749" containerName="registry-server" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.902624 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.907584 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.907876 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.908008 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.908158 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lk6pp"] Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.921493 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.983484 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zkdx\" (UniqueName: \"kubernetes.io/projected/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-kube-api-access-7zkdx\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.983744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-inventory\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.983882 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.983921 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:44 crc kubenswrapper[4921]: I0318 14:09:44.983943 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ceph\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.086179 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zkdx\" (UniqueName: \"kubernetes.io/projected/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-kube-api-access-7zkdx\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.086336 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-inventory\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 
14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.086437 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.086477 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.086501 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ceph\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.093286 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ceph\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.093306 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-inventory\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.093826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.094008 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.118482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zkdx\" (UniqueName: \"kubernetes.io/projected/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-kube-api-access-7zkdx\") pod \"bootstrap-openstack-openstack-cell1-lk6pp\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.232680 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:09:45 crc kubenswrapper[4921]: I0318 14:09:45.825164 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-lk6pp"] Mar 18 14:09:46 crc kubenswrapper[4921]: I0318 14:09:46.428204 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" event={"ID":"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17","Type":"ContainerStarted","Data":"a855342bbb927a705367eaadce28a4418e558891f71a1d3a7646ccaf179e2432"} Mar 18 14:09:46 crc kubenswrapper[4921]: I0318 14:09:46.428786 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" event={"ID":"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17","Type":"ContainerStarted","Data":"8b6caa9e80ea7f25083d9db5bc3771076dfa716008a4725cd712383ad8238862"} Mar 18 14:09:46 crc kubenswrapper[4921]: I0318 14:09:46.466563 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" podStartSLOduration=2.244125662 podStartE2EDuration="2.466537574s" podCreationTimestamp="2026-03-18 14:09:44 +0000 UTC" firstStartedPulling="2026-03-18 14:09:45.825552983 +0000 UTC m=+7205.375473632" lastFinishedPulling="2026-03-18 14:09:46.047964915 +0000 UTC m=+7205.597885544" observedRunningTime="2026-03-18 14:09:46.444641471 +0000 UTC m=+7205.994562120" watchObservedRunningTime="2026-03-18 14:09:46.466537574 +0000 UTC m=+7206.016458223" Mar 18 14:09:51 crc kubenswrapper[4921]: I0318 14:09:51.226926 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:09:51 crc kubenswrapper[4921]: E0318 14:09:51.227718 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.153340 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564050-sbgnf"] Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.156043 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.157933 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.158322 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.161284 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.164171 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-sbgnf"] Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.231988 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljq9j\" (UniqueName: \"kubernetes.io/projected/43fda7ab-0c40-49ae-b010-ba96fdbb0574-kube-api-access-ljq9j\") pod \"auto-csr-approver-29564050-sbgnf\" (UID: \"43fda7ab-0c40-49ae-b010-ba96fdbb0574\") " pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.334497 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljq9j\" (UniqueName: 
\"kubernetes.io/projected/43fda7ab-0c40-49ae-b010-ba96fdbb0574-kube-api-access-ljq9j\") pod \"auto-csr-approver-29564050-sbgnf\" (UID: \"43fda7ab-0c40-49ae-b010-ba96fdbb0574\") " pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.358909 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljq9j\" (UniqueName: \"kubernetes.io/projected/43fda7ab-0c40-49ae-b010-ba96fdbb0574-kube-api-access-ljq9j\") pod \"auto-csr-approver-29564050-sbgnf\" (UID: \"43fda7ab-0c40-49ae-b010-ba96fdbb0574\") " pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.487260 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:00 crc kubenswrapper[4921]: I0318 14:10:00.947833 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-sbgnf"] Mar 18 14:10:00 crc kubenswrapper[4921]: W0318 14:10:00.955848 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43fda7ab_0c40_49ae_b010_ba96fdbb0574.slice/crio-ac7fa82fe7733de30a49fea7f100c66368100618247de7a0f9b704848040c1ac WatchSource:0}: Error finding container ac7fa82fe7733de30a49fea7f100c66368100618247de7a0f9b704848040c1ac: Status 404 returned error can't find the container with id ac7fa82fe7733de30a49fea7f100c66368100618247de7a0f9b704848040c1ac Mar 18 14:10:01 crc kubenswrapper[4921]: I0318 14:10:01.577066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" event={"ID":"43fda7ab-0c40-49ae-b010-ba96fdbb0574","Type":"ContainerStarted","Data":"ac7fa82fe7733de30a49fea7f100c66368100618247de7a0f9b704848040c1ac"} Mar 18 14:10:03 crc kubenswrapper[4921]: I0318 14:10:03.209899 4921 scope.go:117] "RemoveContainer" 
containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:10:03 crc kubenswrapper[4921]: E0318 14:10:03.210754 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:10:03 crc kubenswrapper[4921]: I0318 14:10:03.596204 4921 generic.go:334] "Generic (PLEG): container finished" podID="43fda7ab-0c40-49ae-b010-ba96fdbb0574" containerID="cb4b671d78387f51106752cdd2e363a524c1d6148bdc28eb66f564d0d1d92819" exitCode=0 Mar 18 14:10:03 crc kubenswrapper[4921]: I0318 14:10:03.596603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" event={"ID":"43fda7ab-0c40-49ae-b010-ba96fdbb0574","Type":"ContainerDied","Data":"cb4b671d78387f51106752cdd2e363a524c1d6148bdc28eb66f564d0d1d92819"} Mar 18 14:10:04 crc kubenswrapper[4921]: I0318 14:10:04.976559 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:05 crc kubenswrapper[4921]: I0318 14:10:05.128307 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljq9j\" (UniqueName: \"kubernetes.io/projected/43fda7ab-0c40-49ae-b010-ba96fdbb0574-kube-api-access-ljq9j\") pod \"43fda7ab-0c40-49ae-b010-ba96fdbb0574\" (UID: \"43fda7ab-0c40-49ae-b010-ba96fdbb0574\") " Mar 18 14:10:05 crc kubenswrapper[4921]: I0318 14:10:05.133916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fda7ab-0c40-49ae-b010-ba96fdbb0574-kube-api-access-ljq9j" (OuterVolumeSpecName: "kube-api-access-ljq9j") pod "43fda7ab-0c40-49ae-b010-ba96fdbb0574" (UID: "43fda7ab-0c40-49ae-b010-ba96fdbb0574"). InnerVolumeSpecName "kube-api-access-ljq9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:10:05 crc kubenswrapper[4921]: I0318 14:10:05.230871 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljq9j\" (UniqueName: \"kubernetes.io/projected/43fda7ab-0c40-49ae-b010-ba96fdbb0574-kube-api-access-ljq9j\") on node \"crc\" DevicePath \"\"" Mar 18 14:10:05 crc kubenswrapper[4921]: I0318 14:10:05.624834 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" event={"ID":"43fda7ab-0c40-49ae-b010-ba96fdbb0574","Type":"ContainerDied","Data":"ac7fa82fe7733de30a49fea7f100c66368100618247de7a0f9b704848040c1ac"} Mar 18 14:10:05 crc kubenswrapper[4921]: I0318 14:10:05.624888 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7fa82fe7733de30a49fea7f100c66368100618247de7a0f9b704848040c1ac" Mar 18 14:10:05 crc kubenswrapper[4921]: I0318 14:10:05.624959 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564050-sbgnf" Mar 18 14:10:06 crc kubenswrapper[4921]: I0318 14:10:06.061035 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-mmg5m"] Mar 18 14:10:06 crc kubenswrapper[4921]: I0318 14:10:06.070245 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564044-mmg5m"] Mar 18 14:10:07 crc kubenswrapper[4921]: I0318 14:10:07.221696 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e017f54c-f94e-4793-9818-3c6feb26da84" path="/var/lib/kubelet/pods/e017f54c-f94e-4793-9818-3c6feb26da84/volumes" Mar 18 14:10:15 crc kubenswrapper[4921]: I0318 14:10:15.925819 4921 scope.go:117] "RemoveContainer" containerID="3623a18845652aa7b71d7b591409f22320cd8f9585d8229f532162bc3420d77b" Mar 18 14:10:16 crc kubenswrapper[4921]: I0318 14:10:16.210391 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:10:16 crc kubenswrapper[4921]: E0318 14:10:16.210703 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:10:31 crc kubenswrapper[4921]: I0318 14:10:31.227716 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:10:31 crc kubenswrapper[4921]: E0318 14:10:31.228510 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:10:43 crc kubenswrapper[4921]: I0318 14:10:43.209659 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:10:43 crc kubenswrapper[4921]: E0318 14:10:43.210445 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:10:58 crc kubenswrapper[4921]: I0318 14:10:58.209455 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:10:58 crc kubenswrapper[4921]: E0318 14:10:58.210082 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:11:12 crc kubenswrapper[4921]: I0318 14:11:12.209253 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:11:12 crc kubenswrapper[4921]: E0318 14:11:12.210021 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:11:25 crc kubenswrapper[4921]: I0318 14:11:25.209324 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:11:25 crc kubenswrapper[4921]: E0318 14:11:25.210226 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:11:37 crc kubenswrapper[4921]: I0318 14:11:37.209298 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:11:37 crc kubenswrapper[4921]: E0318 14:11:37.210079 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:11:51 crc kubenswrapper[4921]: I0318 14:11:51.217775 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:11:51 crc kubenswrapper[4921]: E0318 14:11:51.220135 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.160947 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564052-zd4lh"] Mar 18 14:12:00 crc kubenswrapper[4921]: E0318 14:12:00.162017 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fda7ab-0c40-49ae-b010-ba96fdbb0574" containerName="oc" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.162035 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fda7ab-0c40-49ae-b010-ba96fdbb0574" containerName="oc" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.162344 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fda7ab-0c40-49ae-b010-ba96fdbb0574" containerName="oc" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.163566 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.167198 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.167536 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.167924 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.188341 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-zd4lh"] Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.355094 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxjt\" (UniqueName: \"kubernetes.io/projected/982ccb12-8974-497a-96d8-85f699f82204-kube-api-access-4rxjt\") pod \"auto-csr-approver-29564052-zd4lh\" (UID: \"982ccb12-8974-497a-96d8-85f699f82204\") " pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.457950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxjt\" (UniqueName: \"kubernetes.io/projected/982ccb12-8974-497a-96d8-85f699f82204-kube-api-access-4rxjt\") pod \"auto-csr-approver-29564052-zd4lh\" (UID: \"982ccb12-8974-497a-96d8-85f699f82204\") " pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.479801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxjt\" (UniqueName: \"kubernetes.io/projected/982ccb12-8974-497a-96d8-85f699f82204-kube-api-access-4rxjt\") pod \"auto-csr-approver-29564052-zd4lh\" (UID: \"982ccb12-8974-497a-96d8-85f699f82204\") " 
pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:00 crc kubenswrapper[4921]: I0318 14:12:00.490643 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:01 crc kubenswrapper[4921]: I0318 14:12:01.199974 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-zd4lh"] Mar 18 14:12:02 crc kubenswrapper[4921]: I0318 14:12:02.164056 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" event={"ID":"982ccb12-8974-497a-96d8-85f699f82204","Type":"ContainerStarted","Data":"f8ba3a6eaf1ec6973e17c25c46854fbbb0c6159fe5f116a10d8d3c29311e00ce"} Mar 18 14:12:02 crc kubenswrapper[4921]: I0318 14:12:02.209648 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:12:02 crc kubenswrapper[4921]: E0318 14:12:02.210028 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:12:04 crc kubenswrapper[4921]: I0318 14:12:04.181446 4921 generic.go:334] "Generic (PLEG): container finished" podID="982ccb12-8974-497a-96d8-85f699f82204" containerID="67203d3bd278ddbcf0dbff5fc39ffe09fcac385d80548180f4845f0c69f20ae1" exitCode=0 Mar 18 14:12:04 crc kubenswrapper[4921]: I0318 14:12:04.181494 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" event={"ID":"982ccb12-8974-497a-96d8-85f699f82204","Type":"ContainerDied","Data":"67203d3bd278ddbcf0dbff5fc39ffe09fcac385d80548180f4845f0c69f20ae1"} 
Mar 18 14:12:05 crc kubenswrapper[4921]: I0318 14:12:05.586403 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:05 crc kubenswrapper[4921]: I0318 14:12:05.689597 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxjt\" (UniqueName: \"kubernetes.io/projected/982ccb12-8974-497a-96d8-85f699f82204-kube-api-access-4rxjt\") pod \"982ccb12-8974-497a-96d8-85f699f82204\" (UID: \"982ccb12-8974-497a-96d8-85f699f82204\") " Mar 18 14:12:05 crc kubenswrapper[4921]: I0318 14:12:05.696311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982ccb12-8974-497a-96d8-85f699f82204-kube-api-access-4rxjt" (OuterVolumeSpecName: "kube-api-access-4rxjt") pod "982ccb12-8974-497a-96d8-85f699f82204" (UID: "982ccb12-8974-497a-96d8-85f699f82204"). InnerVolumeSpecName "kube-api-access-4rxjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:12:05 crc kubenswrapper[4921]: I0318 14:12:05.791983 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rxjt\" (UniqueName: \"kubernetes.io/projected/982ccb12-8974-497a-96d8-85f699f82204-kube-api-access-4rxjt\") on node \"crc\" DevicePath \"\"" Mar 18 14:12:06 crc kubenswrapper[4921]: I0318 14:12:06.200445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" event={"ID":"982ccb12-8974-497a-96d8-85f699f82204","Type":"ContainerDied","Data":"f8ba3a6eaf1ec6973e17c25c46854fbbb0c6159fe5f116a10d8d3c29311e00ce"} Mar 18 14:12:06 crc kubenswrapper[4921]: I0318 14:12:06.200835 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8ba3a6eaf1ec6973e17c25c46854fbbb0c6159fe5f116a10d8d3c29311e00ce" Mar 18 14:12:06 crc kubenswrapper[4921]: I0318 14:12:06.200512 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564052-zd4lh" Mar 18 14:12:06 crc kubenswrapper[4921]: I0318 14:12:06.669341 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-mgd5h"] Mar 18 14:12:06 crc kubenswrapper[4921]: I0318 14:12:06.681485 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564046-mgd5h"] Mar 18 14:12:07 crc kubenswrapper[4921]: I0318 14:12:07.221165 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934596a1-2e4b-4fea-bf90-b32ae69dcd99" path="/var/lib/kubelet/pods/934596a1-2e4b-4fea-bf90-b32ae69dcd99/volumes" Mar 18 14:12:13 crc kubenswrapper[4921]: I0318 14:12:13.208974 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:12:13 crc kubenswrapper[4921]: E0318 14:12:13.209702 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:12:16 crc kubenswrapper[4921]: I0318 14:12:16.069700 4921 scope.go:117] "RemoveContainer" containerID="77b97581e26a20ee59cc24560c806c6e0944cc0567e9467c3068131f2980bd82" Mar 18 14:12:24 crc kubenswrapper[4921]: I0318 14:12:24.209045 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:12:25 crc kubenswrapper[4921]: I0318 14:12:25.381570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"765723c54e70151d9ddbc2c17504b39000a3457302886d7aa69d6270fb6acb47"} Mar 18 14:13:02 crc kubenswrapper[4921]: I0318 14:13:02.767961 4921 generic.go:334] "Generic (PLEG): container finished" podID="ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" containerID="a855342bbb927a705367eaadce28a4418e558891f71a1d3a7646ccaf179e2432" exitCode=0 Mar 18 14:13:02 crc kubenswrapper[4921]: I0318 14:13:02.768520 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" event={"ID":"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17","Type":"ContainerDied","Data":"a855342bbb927a705367eaadce28a4418e558891f71a1d3a7646ccaf179e2432"} Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.388475 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.395467 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-inventory\") pod \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.395600 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-bootstrap-combined-ca-bundle\") pod \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.395770 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ssh-key-openstack-cell1\") pod \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\" (UID: 
\"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.395987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zkdx\" (UniqueName: \"kubernetes.io/projected/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-kube-api-access-7zkdx\") pod \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.396151 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ceph\") pod \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\" (UID: \"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17\") " Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.401561 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" (UID: "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.402108 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ceph" (OuterVolumeSpecName: "ceph") pod "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" (UID: "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.408006 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-kube-api-access-7zkdx" (OuterVolumeSpecName: "kube-api-access-7zkdx") pod "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" (UID: "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17"). 
InnerVolumeSpecName "kube-api-access-7zkdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.430412 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-inventory" (OuterVolumeSpecName: "inventory") pod "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" (UID: "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.432558 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" (UID: "ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.499225 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.499279 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zkdx\" (UniqueName: \"kubernetes.io/projected/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-kube-api-access-7zkdx\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.499291 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.499301 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.499310 4921 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.791226 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" event={"ID":"ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17","Type":"ContainerDied","Data":"8b6caa9e80ea7f25083d9db5bc3771076dfa716008a4725cd712383ad8238862"} Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.791289 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b6caa9e80ea7f25083d9db5bc3771076dfa716008a4725cd712383ad8238862" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.791381 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-lk6pp" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.890552 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mpg2w"] Mar 18 14:13:04 crc kubenswrapper[4921]: E0318 14:13:04.891549 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" containerName="bootstrap-openstack-openstack-cell1" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.891620 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" containerName="bootstrap-openstack-openstack-cell1" Mar 18 14:13:04 crc kubenswrapper[4921]: E0318 14:13:04.891715 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982ccb12-8974-497a-96d8-85f699f82204" containerName="oc" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.891781 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="982ccb12-8974-497a-96d8-85f699f82204" containerName="oc" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.892175 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="982ccb12-8974-497a-96d8-85f699f82204" containerName="oc" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.892223 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17" containerName="bootstrap-openstack-openstack-cell1" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.893146 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.895023 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.895322 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.896133 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.899213 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.905236 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-inventory\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.905464 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmk89\" (UniqueName: \"kubernetes.io/projected/57495fff-4b60-4b94-91ba-f9c34a6afd6c-kube-api-access-vmk89\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.905614 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ceph\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: 
\"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.905839 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:04 crc kubenswrapper[4921]: I0318 14:13:04.910929 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mpg2w"] Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.008468 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmk89\" (UniqueName: \"kubernetes.io/projected/57495fff-4b60-4b94-91ba-f9c34a6afd6c-kube-api-access-vmk89\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.008616 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ceph\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.008695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " 
pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.008735 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-inventory\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.016079 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ceph\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.016144 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.019618 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-inventory\") pod \"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.025622 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmk89\" (UniqueName: \"kubernetes.io/projected/57495fff-4b60-4b94-91ba-f9c34a6afd6c-kube-api-access-vmk89\") pod 
\"download-cache-openstack-openstack-cell1-mpg2w\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:05 crc kubenswrapper[4921]: I0318 14:13:05.214463 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:13:06 crc kubenswrapper[4921]: I0318 14:13:06.418420 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mpg2w"] Mar 18 14:13:06 crc kubenswrapper[4921]: I0318 14:13:06.430170 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:13:06 crc kubenswrapper[4921]: I0318 14:13:06.811941 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" event={"ID":"57495fff-4b60-4b94-91ba-f9c34a6afd6c","Type":"ContainerStarted","Data":"508234c7e9bf8ec943113ba616d928e3f81fb63e32c370765a8bb49f043b1cb1"} Mar 18 14:13:07 crc kubenswrapper[4921]: I0318 14:13:07.834763 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" event={"ID":"57495fff-4b60-4b94-91ba-f9c34a6afd6c","Type":"ContainerStarted","Data":"4145cddedf4b1fe1b19557244b380d032493a3bbeaa099740ee2bd8bbaacb311"} Mar 18 14:13:07 crc kubenswrapper[4921]: I0318 14:13:07.859594 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" podStartSLOduration=3.671405178 podStartE2EDuration="3.859577014s" podCreationTimestamp="2026-03-18 14:13:04 +0000 UTC" firstStartedPulling="2026-03-18 14:13:06.429901761 +0000 UTC m=+7405.979822400" lastFinishedPulling="2026-03-18 14:13:06.618073597 +0000 UTC m=+7406.167994236" observedRunningTime="2026-03-18 14:13:07.859507692 +0000 UTC m=+7407.409428341" watchObservedRunningTime="2026-03-18 14:13:07.859577014 +0000 
UTC m=+7407.409497653" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.149527 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564054-x4jlk"] Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.151981 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.154393 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.155744 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.155938 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.160935 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-x4jlk"] Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.295602 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzzc\" (UniqueName: \"kubernetes.io/projected/9561c92b-ec12-486a-9135-7bd652668deb-kube-api-access-4xzzc\") pod \"auto-csr-approver-29564054-x4jlk\" (UID: \"9561c92b-ec12-486a-9135-7bd652668deb\") " pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.398292 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzzc\" (UniqueName: \"kubernetes.io/projected/9561c92b-ec12-486a-9135-7bd652668deb-kube-api-access-4xzzc\") pod \"auto-csr-approver-29564054-x4jlk\" (UID: \"9561c92b-ec12-486a-9135-7bd652668deb\") " pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 
14:14:00.423335 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzzc\" (UniqueName: \"kubernetes.io/projected/9561c92b-ec12-486a-9135-7bd652668deb-kube-api-access-4xzzc\") pod \"auto-csr-approver-29564054-x4jlk\" (UID: \"9561c92b-ec12-486a-9135-7bd652668deb\") " pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.475199 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:00 crc kubenswrapper[4921]: I0318 14:14:00.978216 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-x4jlk"] Mar 18 14:14:00 crc kubenswrapper[4921]: W0318 14:14:00.982166 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9561c92b_ec12_486a_9135_7bd652668deb.slice/crio-4e70ed4d013d5c4571dee66fdfa29ce31c1b279b7297675b0cc44ad5aabe8a9d WatchSource:0}: Error finding container 4e70ed4d013d5c4571dee66fdfa29ce31c1b279b7297675b0cc44ad5aabe8a9d: Status 404 returned error can't find the container with id 4e70ed4d013d5c4571dee66fdfa29ce31c1b279b7297675b0cc44ad5aabe8a9d Mar 18 14:14:01 crc kubenswrapper[4921]: I0318 14:14:01.417344 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-x4jlk" event={"ID":"9561c92b-ec12-486a-9135-7bd652668deb","Type":"ContainerStarted","Data":"4e70ed4d013d5c4571dee66fdfa29ce31c1b279b7297675b0cc44ad5aabe8a9d"} Mar 18 14:14:03 crc kubenswrapper[4921]: I0318 14:14:03.439626 4921 generic.go:334] "Generic (PLEG): container finished" podID="9561c92b-ec12-486a-9135-7bd652668deb" containerID="1a0ee61789ba7c3b295ca5cdc360c689f6ee1ce1941c0bc867434da5c84467bd" exitCode=0 Mar 18 14:14:03 crc kubenswrapper[4921]: I0318 14:14:03.439708 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564054-x4jlk" event={"ID":"9561c92b-ec12-486a-9135-7bd652668deb","Type":"ContainerDied","Data":"1a0ee61789ba7c3b295ca5cdc360c689f6ee1ce1941c0bc867434da5c84467bd"} Mar 18 14:14:04 crc kubenswrapper[4921]: I0318 14:14:04.814537 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:04 crc kubenswrapper[4921]: I0318 14:14:04.909212 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzzc\" (UniqueName: \"kubernetes.io/projected/9561c92b-ec12-486a-9135-7bd652668deb-kube-api-access-4xzzc\") pod \"9561c92b-ec12-486a-9135-7bd652668deb\" (UID: \"9561c92b-ec12-486a-9135-7bd652668deb\") " Mar 18 14:14:04 crc kubenswrapper[4921]: I0318 14:14:04.915440 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9561c92b-ec12-486a-9135-7bd652668deb-kube-api-access-4xzzc" (OuterVolumeSpecName: "kube-api-access-4xzzc") pod "9561c92b-ec12-486a-9135-7bd652668deb" (UID: "9561c92b-ec12-486a-9135-7bd652668deb"). InnerVolumeSpecName "kube-api-access-4xzzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:14:05 crc kubenswrapper[4921]: I0318 14:14:05.012229 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzzc\" (UniqueName: \"kubernetes.io/projected/9561c92b-ec12-486a-9135-7bd652668deb-kube-api-access-4xzzc\") on node \"crc\" DevicePath \"\"" Mar 18 14:14:05 crc kubenswrapper[4921]: I0318 14:14:05.462534 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564054-x4jlk" event={"ID":"9561c92b-ec12-486a-9135-7bd652668deb","Type":"ContainerDied","Data":"4e70ed4d013d5c4571dee66fdfa29ce31c1b279b7297675b0cc44ad5aabe8a9d"} Mar 18 14:14:05 crc kubenswrapper[4921]: I0318 14:14:05.462582 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e70ed4d013d5c4571dee66fdfa29ce31c1b279b7297675b0cc44ad5aabe8a9d" Mar 18 14:14:05 crc kubenswrapper[4921]: I0318 14:14:05.462645 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564054-x4jlk" Mar 18 14:14:05 crc kubenswrapper[4921]: I0318 14:14:05.887179 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-hrw97"] Mar 18 14:14:05 crc kubenswrapper[4921]: I0318 14:14:05.895023 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564048-hrw97"] Mar 18 14:14:07 crc kubenswrapper[4921]: I0318 14:14:07.224089 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b71ff6-45a1-4676-a640-daa488a2985b" path="/var/lib/kubelet/pods/b8b71ff6-45a1-4676-a640-daa488a2985b/volumes" Mar 18 14:14:16 crc kubenswrapper[4921]: I0318 14:14:16.189027 4921 scope.go:117] "RemoveContainer" containerID="49c50ab2154ad5d0ff223eb07adfd20bb84e9ac1aaeb9c24eb566f61e126a1d0" Mar 18 14:14:47 crc kubenswrapper[4921]: I0318 14:14:47.081749 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:14:47 crc kubenswrapper[4921]: I0318 14:14:47.082421 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.203368 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb"] Mar 18 14:15:00 crc kubenswrapper[4921]: E0318 14:15:00.204951 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9561c92b-ec12-486a-9135-7bd652668deb" containerName="oc" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.204968 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9561c92b-ec12-486a-9135-7bd652668deb" containerName="oc" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.205259 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9561c92b-ec12-486a-9135-7bd652668deb" containerName="oc" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.206323 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.209708 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.209971 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.238544 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb"] Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.332984 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bkk\" (UniqueName: \"kubernetes.io/projected/d66d47bd-47c8-4067-bf4b-c47fc4cec112-kube-api-access-v5bkk\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.333365 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66d47bd-47c8-4067-bf4b-c47fc4cec112-config-volume\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.333445 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66d47bd-47c8-4067-bf4b-c47fc4cec112-secret-volume\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.435202 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bkk\" (UniqueName: \"kubernetes.io/projected/d66d47bd-47c8-4067-bf4b-c47fc4cec112-kube-api-access-v5bkk\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.435261 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66d47bd-47c8-4067-bf4b-c47fc4cec112-config-volume\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.435294 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66d47bd-47c8-4067-bf4b-c47fc4cec112-secret-volume\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.436486 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66d47bd-47c8-4067-bf4b-c47fc4cec112-config-volume\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.440756 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d66d47bd-47c8-4067-bf4b-c47fc4cec112-secret-volume\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.459680 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bkk\" (UniqueName: \"kubernetes.io/projected/d66d47bd-47c8-4067-bf4b-c47fc4cec112-kube-api-access-v5bkk\") pod \"collect-profiles-29564055-4tpxb\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:00 crc kubenswrapper[4921]: I0318 14:15:00.529598 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:01 crc kubenswrapper[4921]: I0318 14:15:01.021805 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb"] Mar 18 14:15:02 crc kubenswrapper[4921]: I0318 14:15:02.049644 4921 generic.go:334] "Generic (PLEG): container finished" podID="d66d47bd-47c8-4067-bf4b-c47fc4cec112" containerID="f4399ae68b772938b461b0fe295d8f651010c5e0093879815f53b591e8211100" exitCode=0 Mar 18 14:15:02 crc kubenswrapper[4921]: I0318 14:15:02.049684 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" event={"ID":"d66d47bd-47c8-4067-bf4b-c47fc4cec112","Type":"ContainerDied","Data":"f4399ae68b772938b461b0fe295d8f651010c5e0093879815f53b591e8211100"} Mar 18 14:15:02 crc kubenswrapper[4921]: I0318 14:15:02.050286 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" 
event={"ID":"d66d47bd-47c8-4067-bf4b-c47fc4cec112","Type":"ContainerStarted","Data":"106c6cd2823094b7d42b9b53556f708eb57ddb2ecb5cd500b37de9b2b11724cc"} Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.550840 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.709283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bkk\" (UniqueName: \"kubernetes.io/projected/d66d47bd-47c8-4067-bf4b-c47fc4cec112-kube-api-access-v5bkk\") pod \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.709387 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66d47bd-47c8-4067-bf4b-c47fc4cec112-secret-volume\") pod \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.709457 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66d47bd-47c8-4067-bf4b-c47fc4cec112-config-volume\") pod \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\" (UID: \"d66d47bd-47c8-4067-bf4b-c47fc4cec112\") " Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.710641 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66d47bd-47c8-4067-bf4b-c47fc4cec112-config-volume" (OuterVolumeSpecName: "config-volume") pod "d66d47bd-47c8-4067-bf4b-c47fc4cec112" (UID: "d66d47bd-47c8-4067-bf4b-c47fc4cec112"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.720370 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66d47bd-47c8-4067-bf4b-c47fc4cec112-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d66d47bd-47c8-4067-bf4b-c47fc4cec112" (UID: "d66d47bd-47c8-4067-bf4b-c47fc4cec112"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.720442 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66d47bd-47c8-4067-bf4b-c47fc4cec112-kube-api-access-v5bkk" (OuterVolumeSpecName: "kube-api-access-v5bkk") pod "d66d47bd-47c8-4067-bf4b-c47fc4cec112" (UID: "d66d47bd-47c8-4067-bf4b-c47fc4cec112"). InnerVolumeSpecName "kube-api-access-v5bkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.812663 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5bkk\" (UniqueName: \"kubernetes.io/projected/d66d47bd-47c8-4067-bf4b-c47fc4cec112-kube-api-access-v5bkk\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.812710 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66d47bd-47c8-4067-bf4b-c47fc4cec112-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:03 crc kubenswrapper[4921]: I0318 14:15:03.812725 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66d47bd-47c8-4067-bf4b-c47fc4cec112-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:04 crc kubenswrapper[4921]: I0318 14:15:04.069827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" 
event={"ID":"d66d47bd-47c8-4067-bf4b-c47fc4cec112","Type":"ContainerDied","Data":"106c6cd2823094b7d42b9b53556f708eb57ddb2ecb5cd500b37de9b2b11724cc"} Mar 18 14:15:04 crc kubenswrapper[4921]: I0318 14:15:04.070215 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="106c6cd2823094b7d42b9b53556f708eb57ddb2ecb5cd500b37de9b2b11724cc" Mar 18 14:15:04 crc kubenswrapper[4921]: I0318 14:15:04.069870 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564055-4tpxb" Mar 18 14:15:04 crc kubenswrapper[4921]: I0318 14:15:04.637732 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"] Mar 18 14:15:04 crc kubenswrapper[4921]: I0318 14:15:04.647549 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-qxjb6"] Mar 18 14:15:05 crc kubenswrapper[4921]: I0318 14:15:05.972912 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fc77ad-30a6-4212-ac1e-10df854673d4" path="/var/lib/kubelet/pods/c2fc77ad-30a6-4212-ac1e-10df854673d4/volumes" Mar 18 14:15:14 crc kubenswrapper[4921]: I0318 14:15:14.173388 4921 generic.go:334] "Generic (PLEG): container finished" podID="57495fff-4b60-4b94-91ba-f9c34a6afd6c" containerID="4145cddedf4b1fe1b19557244b380d032493a3bbeaa099740ee2bd8bbaacb311" exitCode=0 Mar 18 14:15:14 crc kubenswrapper[4921]: I0318 14:15:14.173480 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" event={"ID":"57495fff-4b60-4b94-91ba-f9c34a6afd6c","Type":"ContainerDied","Data":"4145cddedf4b1fe1b19557244b380d032493a3bbeaa099740ee2bd8bbaacb311"} Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.747568 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.825002 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-inventory\") pod \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.825129 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmk89\" (UniqueName: \"kubernetes.io/projected/57495fff-4b60-4b94-91ba-f9c34a6afd6c-kube-api-access-vmk89\") pod \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.825196 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ssh-key-openstack-cell1\") pod \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.825232 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ceph\") pod \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\" (UID: \"57495fff-4b60-4b94-91ba-f9c34a6afd6c\") " Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.831453 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57495fff-4b60-4b94-91ba-f9c34a6afd6c-kube-api-access-vmk89" (OuterVolumeSpecName: "kube-api-access-vmk89") pod "57495fff-4b60-4b94-91ba-f9c34a6afd6c" (UID: "57495fff-4b60-4b94-91ba-f9c34a6afd6c"). InnerVolumeSpecName "kube-api-access-vmk89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.836187 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ceph" (OuterVolumeSpecName: "ceph") pod "57495fff-4b60-4b94-91ba-f9c34a6afd6c" (UID: "57495fff-4b60-4b94-91ba-f9c34a6afd6c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.859373 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "57495fff-4b60-4b94-91ba-f9c34a6afd6c" (UID: "57495fff-4b60-4b94-91ba-f9c34a6afd6c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.863329 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-inventory" (OuterVolumeSpecName: "inventory") pod "57495fff-4b60-4b94-91ba-f9c34a6afd6c" (UID: "57495fff-4b60-4b94-91ba-f9c34a6afd6c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.928769 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.929095 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmk89\" (UniqueName: \"kubernetes.io/projected/57495fff-4b60-4b94-91ba-f9c34a6afd6c-kube-api-access-vmk89\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.929130 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:15 crc kubenswrapper[4921]: I0318 14:15:15.929144 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57495fff-4b60-4b94-91ba-f9c34a6afd6c-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.197519 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" event={"ID":"57495fff-4b60-4b94-91ba-f9c34a6afd6c","Type":"ContainerDied","Data":"508234c7e9bf8ec943113ba616d928e3f81fb63e32c370765a8bb49f043b1cb1"} Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.197580 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508234c7e9bf8ec943113ba616d928e3f81fb63e32c370765a8bb49f043b1cb1" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.197959 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mpg2w" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.307500 4921 scope.go:117] "RemoveContainer" containerID="442a1e494b4f5a6b449196769045786fbeeb468ad31f06294237bd669c6bb640" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.320534 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-wdz5v"] Mar 18 14:15:16 crc kubenswrapper[4921]: E0318 14:15:16.322916 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66d47bd-47c8-4067-bf4b-c47fc4cec112" containerName="collect-profiles" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.322942 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66d47bd-47c8-4067-bf4b-c47fc4cec112" containerName="collect-profiles" Mar 18 14:15:16 crc kubenswrapper[4921]: E0318 14:15:16.322958 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57495fff-4b60-4b94-91ba-f9c34a6afd6c" containerName="download-cache-openstack-openstack-cell1" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.322966 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="57495fff-4b60-4b94-91ba-f9c34a6afd6c" containerName="download-cache-openstack-openstack-cell1" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.323920 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="57495fff-4b60-4b94-91ba-f9c34a6afd6c" containerName="download-cache-openstack-openstack-cell1" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.323945 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66d47bd-47c8-4067-bf4b-c47fc4cec112" containerName="collect-profiles" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.327590 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.331837 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.332205 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.332606 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.332961 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.341823 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nqwh\" (UniqueName: \"kubernetes.io/projected/8591ea69-221d-49f3-be00-607522d37c6e-kube-api-access-7nqwh\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.341915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ceph\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.341957 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ssh-key-openstack-cell1\") pod 
\"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.342097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-inventory\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.373181 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-wdz5v"] Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.444141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nqwh\" (UniqueName: \"kubernetes.io/projected/8591ea69-221d-49f3-be00-607522d37c6e-kube-api-access-7nqwh\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.444190 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ceph\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.444218 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: 
\"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.444270 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-inventory\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.448789 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.449072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ceph\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.450245 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-inventory\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.463617 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nqwh\" (UniqueName: 
\"kubernetes.io/projected/8591ea69-221d-49f3-be00-607522d37c6e-kube-api-access-7nqwh\") pod \"configure-network-openstack-openstack-cell1-wdz5v\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:16 crc kubenswrapper[4921]: I0318 14:15:16.697371 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:15:17 crc kubenswrapper[4921]: I0318 14:15:17.081176 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:15:17 crc kubenswrapper[4921]: I0318 14:15:17.081517 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:15:17 crc kubenswrapper[4921]: I0318 14:15:17.281944 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-wdz5v"] Mar 18 14:15:18 crc kubenswrapper[4921]: I0318 14:15:18.217242 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" event={"ID":"8591ea69-221d-49f3-be00-607522d37c6e","Type":"ContainerStarted","Data":"c2cb73a6dedf1fab654b8e6153e4d0d14e367a9111242b9d278081171659b744"} Mar 18 14:15:18 crc kubenswrapper[4921]: I0318 14:15:18.217770 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" 
event={"ID":"8591ea69-221d-49f3-be00-607522d37c6e","Type":"ContainerStarted","Data":"b01562b5f1a763c79dcaacaa80c5a71c1a445d36b4f960cb40e858a0e00542c2"} Mar 18 14:15:18 crc kubenswrapper[4921]: I0318 14:15:18.241468 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" podStartSLOduration=2.067079991 podStartE2EDuration="2.241440707s" podCreationTimestamp="2026-03-18 14:15:16 +0000 UTC" firstStartedPulling="2026-03-18 14:15:17.292458545 +0000 UTC m=+7536.842379184" lastFinishedPulling="2026-03-18 14:15:17.466819261 +0000 UTC m=+7537.016739900" observedRunningTime="2026-03-18 14:15:18.23690376 +0000 UTC m=+7537.786824419" watchObservedRunningTime="2026-03-18 14:15:18.241440707 +0000 UTC m=+7537.791361356" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.510036 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4n6fz"] Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.518543 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.542917 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n6fz"] Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.676959 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-utilities\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.677204 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfhgl\" (UniqueName: \"kubernetes.io/projected/a082812a-5f6d-475e-8b56-f9e35ff7f705-kube-api-access-pfhgl\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.677388 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-catalog-content\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.780008 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-utilities\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.780163 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pfhgl\" (UniqueName: \"kubernetes.io/projected/a082812a-5f6d-475e-8b56-f9e35ff7f705-kube-api-access-pfhgl\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.780196 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-catalog-content\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.780683 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-utilities\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.780810 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-catalog-content\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.806715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfhgl\" (UniqueName: \"kubernetes.io/projected/a082812a-5f6d-475e-8b56-f9e35ff7f705-kube-api-access-pfhgl\") pod \"redhat-marketplace-4n6fz\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:26 crc kubenswrapper[4921]: I0318 14:15:26.845589 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:27 crc kubenswrapper[4921]: I0318 14:15:27.418405 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n6fz"] Mar 18 14:15:28 crc kubenswrapper[4921]: I0318 14:15:28.336040 4921 generic.go:334] "Generic (PLEG): container finished" podID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerID="7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8" exitCode=0 Mar 18 14:15:28 crc kubenswrapper[4921]: I0318 14:15:28.338528 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerDied","Data":"7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8"} Mar 18 14:15:28 crc kubenswrapper[4921]: I0318 14:15:28.338696 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerStarted","Data":"1f702fd604787cd28b06167d35843e6d7977685e67353f890905acbf78e8b1cc"} Mar 18 14:15:29 crc kubenswrapper[4921]: I0318 14:15:29.348663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerStarted","Data":"73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc"} Mar 18 14:15:30 crc kubenswrapper[4921]: I0318 14:15:30.359956 4921 generic.go:334] "Generic (PLEG): container finished" podID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerID="73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc" exitCode=0 Mar 18 14:15:30 crc kubenswrapper[4921]: I0318 14:15:30.360347 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" 
event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerDied","Data":"73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc"} Mar 18 14:15:31 crc kubenswrapper[4921]: I0318 14:15:31.397049 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerStarted","Data":"a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773"} Mar 18 14:15:31 crc kubenswrapper[4921]: I0318 14:15:31.426293 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4n6fz" podStartSLOduration=2.950714739 podStartE2EDuration="5.426274759s" podCreationTimestamp="2026-03-18 14:15:26 +0000 UTC" firstStartedPulling="2026-03-18 14:15:28.34051881 +0000 UTC m=+7547.890439449" lastFinishedPulling="2026-03-18 14:15:30.81607883 +0000 UTC m=+7550.365999469" observedRunningTime="2026-03-18 14:15:31.417946007 +0000 UTC m=+7550.967866646" watchObservedRunningTime="2026-03-18 14:15:31.426274759 +0000 UTC m=+7550.976195398" Mar 18 14:15:36 crc kubenswrapper[4921]: I0318 14:15:36.846290 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:36 crc kubenswrapper[4921]: I0318 14:15:36.846821 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:37 crc kubenswrapper[4921]: I0318 14:15:37.895152 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-4n6fz" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="registry-server" probeResult="failure" output=< Mar 18 14:15:37 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 14:15:37 crc kubenswrapper[4921]: > Mar 18 14:15:46 crc kubenswrapper[4921]: I0318 14:15:46.914740 4921 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:46 crc kubenswrapper[4921]: I0318 14:15:46.970694 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.081839 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.082322 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.082495 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.083533 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"765723c54e70151d9ddbc2c17504b39000a3457302886d7aa69d6270fb6acb47"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.083723 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" 
containerID="cri-o://765723c54e70151d9ddbc2c17504b39000a3457302886d7aa69d6270fb6acb47" gracePeriod=600 Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.151250 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n6fz"] Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.582934 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="765723c54e70151d9ddbc2c17504b39000a3457302886d7aa69d6270fb6acb47" exitCode=0 Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.583023 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"765723c54e70151d9ddbc2c17504b39000a3457302886d7aa69d6270fb6acb47"} Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.583315 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88"} Mar 18 14:15:47 crc kubenswrapper[4921]: I0318 14:15:47.583333 4921 scope.go:117] "RemoveContainer" containerID="b5fb32db4c8b23ff54f2855f9f8e03509c8d5ac714c33819dcd7f6aef9b14354" Mar 18 14:15:48 crc kubenswrapper[4921]: I0318 14:15:48.594314 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4n6fz" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="registry-server" containerID="cri-o://a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773" gracePeriod=2 Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.353709 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.530269 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfhgl\" (UniqueName: \"kubernetes.io/projected/a082812a-5f6d-475e-8b56-f9e35ff7f705-kube-api-access-pfhgl\") pod \"a082812a-5f6d-475e-8b56-f9e35ff7f705\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.530619 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-catalog-content\") pod \"a082812a-5f6d-475e-8b56-f9e35ff7f705\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.530922 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-utilities\") pod \"a082812a-5f6d-475e-8b56-f9e35ff7f705\" (UID: \"a082812a-5f6d-475e-8b56-f9e35ff7f705\") " Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.532549 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-utilities" (OuterVolumeSpecName: "utilities") pod "a082812a-5f6d-475e-8b56-f9e35ff7f705" (UID: "a082812a-5f6d-475e-8b56-f9e35ff7f705"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.549539 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a082812a-5f6d-475e-8b56-f9e35ff7f705-kube-api-access-pfhgl" (OuterVolumeSpecName: "kube-api-access-pfhgl") pod "a082812a-5f6d-475e-8b56-f9e35ff7f705" (UID: "a082812a-5f6d-475e-8b56-f9e35ff7f705"). InnerVolumeSpecName "kube-api-access-pfhgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.558535 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a082812a-5f6d-475e-8b56-f9e35ff7f705" (UID: "a082812a-5f6d-475e-8b56-f9e35ff7f705"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.610146 4921 generic.go:334] "Generic (PLEG): container finished" podID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerID="a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773" exitCode=0 Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.610234 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4n6fz" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.610230 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerDied","Data":"a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773"} Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.610619 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4n6fz" event={"ID":"a082812a-5f6d-475e-8b56-f9e35ff7f705","Type":"ContainerDied","Data":"1f702fd604787cd28b06167d35843e6d7977685e67353f890905acbf78e8b1cc"} Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.610667 4921 scope.go:117] "RemoveContainer" containerID="a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.633406 4921 scope.go:117] "RemoveContainer" containerID="73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 
14:15:49.633946 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.633969 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfhgl\" (UniqueName: \"kubernetes.io/projected/a082812a-5f6d-475e-8b56-f9e35ff7f705-kube-api-access-pfhgl\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.633980 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a082812a-5f6d-475e-8b56-f9e35ff7f705-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.658231 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n6fz"] Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.667600 4921 scope.go:117] "RemoveContainer" containerID="7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.691150 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4n6fz"] Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.727344 4921 scope.go:117] "RemoveContainer" containerID="a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773" Mar 18 14:15:49 crc kubenswrapper[4921]: E0318 14:15:49.729268 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773\": container with ID starting with a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773 not found: ID does not exist" containerID="a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.729346 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773"} err="failed to get container status \"a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773\": rpc error: code = NotFound desc = could not find container \"a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773\": container with ID starting with a0a9135f730c4e175827962a349fee5d839b9ab4c176a70874e51b4cdce89773 not found: ID does not exist" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.729409 4921 scope.go:117] "RemoveContainer" containerID="73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc" Mar 18 14:15:49 crc kubenswrapper[4921]: E0318 14:15:49.729906 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc\": container with ID starting with 73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc not found: ID does not exist" containerID="73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.729953 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc"} err="failed to get container status \"73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc\": rpc error: code = NotFound desc = could not find container \"73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc\": container with ID starting with 73c0315b98032444ee28621be23c4814786e1885671815e2513fb06109b3f1fc not found: ID does not exist" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.729982 4921 scope.go:117] "RemoveContainer" containerID="7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8" Mar 18 14:15:49 crc kubenswrapper[4921]: E0318 
14:15:49.730382 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8\": container with ID starting with 7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8 not found: ID does not exist" containerID="7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8" Mar 18 14:15:49 crc kubenswrapper[4921]: I0318 14:15:49.730423 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8"} err="failed to get container status \"7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8\": rpc error: code = NotFound desc = could not find container \"7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8\": container with ID starting with 7e60ce74f90aede9643af284df5f65bb8610eaeb6968078c3d5bfb282037cbf8 not found: ID does not exist" Mar 18 14:15:51 crc kubenswrapper[4921]: I0318 14:15:51.234480 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" path="/var/lib/kubelet/pods/a082812a-5f6d-475e-8b56-f9e35ff7f705/volumes" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.163467 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4pvlq"] Mar 18 14:16:00 crc kubenswrapper[4921]: E0318 14:16:00.166316 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="registry-server" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.166425 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="registry-server" Mar 18 14:16:00 crc kubenswrapper[4921]: E0318 14:16:00.166522 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="extract-content" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.166588 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="extract-content" Mar 18 14:16:00 crc kubenswrapper[4921]: E0318 14:16:00.166659 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="extract-utilities" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.166715 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="extract-utilities" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.167008 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a082812a-5f6d-475e-8b56-f9e35ff7f705" containerName="registry-server" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.167995 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.174287 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4pvlq"] Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.174953 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.175212 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.175289 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.357966 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8fkf\" (UniqueName: 
\"kubernetes.io/projected/ffdc395d-7849-4de7-ae87-38e625fc5774-kube-api-access-s8fkf\") pod \"auto-csr-approver-29564056-4pvlq\" (UID: \"ffdc395d-7849-4de7-ae87-38e625fc5774\") " pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.460035 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8fkf\" (UniqueName: \"kubernetes.io/projected/ffdc395d-7849-4de7-ae87-38e625fc5774-kube-api-access-s8fkf\") pod \"auto-csr-approver-29564056-4pvlq\" (UID: \"ffdc395d-7849-4de7-ae87-38e625fc5774\") " pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.482629 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8fkf\" (UniqueName: \"kubernetes.io/projected/ffdc395d-7849-4de7-ae87-38e625fc5774-kube-api-access-s8fkf\") pod \"auto-csr-approver-29564056-4pvlq\" (UID: \"ffdc395d-7849-4de7-ae87-38e625fc5774\") " pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.490488 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:00 crc kubenswrapper[4921]: I0318 14:16:00.997138 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4pvlq"] Mar 18 14:16:01 crc kubenswrapper[4921]: I0318 14:16:01.733586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" event={"ID":"ffdc395d-7849-4de7-ae87-38e625fc5774","Type":"ContainerStarted","Data":"9c1526d10105d0de04d89c51ed0992b32a8135c195387452f3522264c3028edd"} Mar 18 14:16:02 crc kubenswrapper[4921]: I0318 14:16:02.743533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" event={"ID":"ffdc395d-7849-4de7-ae87-38e625fc5774","Type":"ContainerStarted","Data":"6682d561b6ce4ce7ef70d751e08f812ac3ca9878532fce956be3230c17751412"} Mar 18 14:16:02 crc kubenswrapper[4921]: I0318 14:16:02.773335 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" podStartSLOduration=1.51937145 podStartE2EDuration="2.773316084s" podCreationTimestamp="2026-03-18 14:16:00 +0000 UTC" firstStartedPulling="2026-03-18 14:16:01.023220603 +0000 UTC m=+7580.573141242" lastFinishedPulling="2026-03-18 14:16:02.277165237 +0000 UTC m=+7581.827085876" observedRunningTime="2026-03-18 14:16:02.764743505 +0000 UTC m=+7582.314664134" watchObservedRunningTime="2026-03-18 14:16:02.773316084 +0000 UTC m=+7582.323236723" Mar 18 14:16:03 crc kubenswrapper[4921]: I0318 14:16:03.755463 4921 generic.go:334] "Generic (PLEG): container finished" podID="ffdc395d-7849-4de7-ae87-38e625fc5774" containerID="6682d561b6ce4ce7ef70d751e08f812ac3ca9878532fce956be3230c17751412" exitCode=0 Mar 18 14:16:03 crc kubenswrapper[4921]: I0318 14:16:03.755566 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" 
event={"ID":"ffdc395d-7849-4de7-ae87-38e625fc5774","Type":"ContainerDied","Data":"6682d561b6ce4ce7ef70d751e08f812ac3ca9878532fce956be3230c17751412"} Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.345504 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.492309 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8fkf\" (UniqueName: \"kubernetes.io/projected/ffdc395d-7849-4de7-ae87-38e625fc5774-kube-api-access-s8fkf\") pod \"ffdc395d-7849-4de7-ae87-38e625fc5774\" (UID: \"ffdc395d-7849-4de7-ae87-38e625fc5774\") " Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.505469 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffdc395d-7849-4de7-ae87-38e625fc5774-kube-api-access-s8fkf" (OuterVolumeSpecName: "kube-api-access-s8fkf") pod "ffdc395d-7849-4de7-ae87-38e625fc5774" (UID: "ffdc395d-7849-4de7-ae87-38e625fc5774"). InnerVolumeSpecName "kube-api-access-s8fkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.594690 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8fkf\" (UniqueName: \"kubernetes.io/projected/ffdc395d-7849-4de7-ae87-38e625fc5774-kube-api-access-s8fkf\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.775650 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" event={"ID":"ffdc395d-7849-4de7-ae87-38e625fc5774","Type":"ContainerDied","Data":"9c1526d10105d0de04d89c51ed0992b32a8135c195387452f3522264c3028edd"} Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.776001 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1526d10105d0de04d89c51ed0992b32a8135c195387452f3522264c3028edd" Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.775735 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564056-4pvlq" Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.842784 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-sbgnf"] Mar 18 14:16:05 crc kubenswrapper[4921]: I0318 14:16:05.854251 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564050-sbgnf"] Mar 18 14:16:07 crc kubenswrapper[4921]: I0318 14:16:07.221224 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fda7ab-0c40-49ae-b010-ba96fdbb0574" path="/var/lib/kubelet/pods/43fda7ab-0c40-49ae-b010-ba96fdbb0574/volumes" Mar 18 14:16:16 crc kubenswrapper[4921]: I0318 14:16:16.405592 4921 scope.go:117] "RemoveContainer" containerID="cb4b671d78387f51106752cdd2e363a524c1d6148bdc28eb66f564d0d1d92819" Mar 18 14:16:39 crc kubenswrapper[4921]: I0318 14:16:39.103684 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="8591ea69-221d-49f3-be00-607522d37c6e" containerID="c2cb73a6dedf1fab654b8e6153e4d0d14e367a9111242b9d278081171659b744" exitCode=0 Mar 18 14:16:39 crc kubenswrapper[4921]: I0318 14:16:39.103754 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" event={"ID":"8591ea69-221d-49f3-be00-607522d37c6e","Type":"ContainerDied","Data":"c2cb73a6dedf1fab654b8e6153e4d0d14e367a9111242b9d278081171659b744"} Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.633681 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.677761 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ceph\") pod \"8591ea69-221d-49f3-be00-607522d37c6e\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.677878 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ssh-key-openstack-cell1\") pod \"8591ea69-221d-49f3-be00-607522d37c6e\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.678017 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nqwh\" (UniqueName: \"kubernetes.io/projected/8591ea69-221d-49f3-be00-607522d37c6e-kube-api-access-7nqwh\") pod \"8591ea69-221d-49f3-be00-607522d37c6e\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.678112 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-inventory\") pod 
\"8591ea69-221d-49f3-be00-607522d37c6e\" (UID: \"8591ea69-221d-49f3-be00-607522d37c6e\") " Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.685290 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ceph" (OuterVolumeSpecName: "ceph") pod "8591ea69-221d-49f3-be00-607522d37c6e" (UID: "8591ea69-221d-49f3-be00-607522d37c6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.688951 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8591ea69-221d-49f3-be00-607522d37c6e-kube-api-access-7nqwh" (OuterVolumeSpecName: "kube-api-access-7nqwh") pod "8591ea69-221d-49f3-be00-607522d37c6e" (UID: "8591ea69-221d-49f3-be00-607522d37c6e"). InnerVolumeSpecName "kube-api-access-7nqwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.728451 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-inventory" (OuterVolumeSpecName: "inventory") pod "8591ea69-221d-49f3-be00-607522d37c6e" (UID: "8591ea69-221d-49f3-be00-607522d37c6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.738264 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8591ea69-221d-49f3-be00-607522d37c6e" (UID: "8591ea69-221d-49f3-be00-607522d37c6e"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.781236 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nqwh\" (UniqueName: \"kubernetes.io/projected/8591ea69-221d-49f3-be00-607522d37c6e-kube-api-access-7nqwh\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.781280 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.781292 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:40 crc kubenswrapper[4921]: I0318 14:16:40.781301 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8591ea69-221d-49f3-be00-607522d37c6e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.142354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" event={"ID":"8591ea69-221d-49f3-be00-607522d37c6e","Type":"ContainerDied","Data":"b01562b5f1a763c79dcaacaa80c5a71c1a445d36b4f960cb40e858a0e00542c2"} Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.142908 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-wdz5v" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.142916 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b01562b5f1a763c79dcaacaa80c5a71c1a445d36b4f960cb40e858a0e00542c2" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.236317 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-95t9s"] Mar 18 14:16:41 crc kubenswrapper[4921]: E0318 14:16:41.236860 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8591ea69-221d-49f3-be00-607522d37c6e" containerName="configure-network-openstack-openstack-cell1" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.236879 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8591ea69-221d-49f3-be00-607522d37c6e" containerName="configure-network-openstack-openstack-cell1" Mar 18 14:16:41 crc kubenswrapper[4921]: E0318 14:16:41.236901 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdc395d-7849-4de7-ae87-38e625fc5774" containerName="oc" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.236909 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc395d-7849-4de7-ae87-38e625fc5774" containerName="oc" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.237144 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdc395d-7849-4de7-ae87-38e625fc5774" containerName="oc" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.237161 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8591ea69-221d-49f3-be00-607522d37c6e" containerName="configure-network-openstack-openstack-cell1" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.237940 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.240232 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.240483 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.240661 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.241080 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.246217 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-95t9s"] Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.292761 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ceph\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.292853 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-inventory\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.292932 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dt8gr\" (UniqueName: \"kubernetes.io/projected/30d774e9-5ee7-42b1-8883-ddedecfcaa13-kube-api-access-dt8gr\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.292969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.395926 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ceph\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.396073 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-inventory\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.396197 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt8gr\" (UniqueName: \"kubernetes.io/projected/30d774e9-5ee7-42b1-8883-ddedecfcaa13-kube-api-access-dt8gr\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " 
pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.396260 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.398195 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.398244 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.400842 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ceph\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.409302 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-inventory\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.409852 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: 
\"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.412093 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt8gr\" (UniqueName: \"kubernetes.io/projected/30d774e9-5ee7-42b1-8883-ddedecfcaa13-kube-api-access-dt8gr\") pod \"validate-network-openstack-openstack-cell1-95t9s\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.623870 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:16:41 crc kubenswrapper[4921]: I0318 14:16:41.632200 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:42 crc kubenswrapper[4921]: I0318 14:16:42.422281 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-95t9s"] Mar 18 14:16:43 crc kubenswrapper[4921]: I0318 14:16:43.164905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" event={"ID":"30d774e9-5ee7-42b1-8883-ddedecfcaa13","Type":"ContainerStarted","Data":"48b0b36b732d2a4815132731ca2b7af7be001f762a0efa3fb58893865946e8e2"} Mar 18 14:16:43 crc kubenswrapper[4921]: I0318 14:16:43.165276 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" event={"ID":"30d774e9-5ee7-42b1-8883-ddedecfcaa13","Type":"ContainerStarted","Data":"6122df3fbacf421a16f387e28a04f8b4e34e3d72e51e6cf8e278031156b5f1e9"} Mar 18 14:16:43 crc kubenswrapper[4921]: I0318 14:16:43.195888 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" 
podStartSLOduration=2.017019478 podStartE2EDuration="2.195867201s" podCreationTimestamp="2026-03-18 14:16:41 +0000 UTC" firstStartedPulling="2026-03-18 14:16:42.434337272 +0000 UTC m=+7621.984257911" lastFinishedPulling="2026-03-18 14:16:42.613184995 +0000 UTC m=+7622.163105634" observedRunningTime="2026-03-18 14:16:43.182255089 +0000 UTC m=+7622.732175738" watchObservedRunningTime="2026-03-18 14:16:43.195867201 +0000 UTC m=+7622.745787840" Mar 18 14:16:48 crc kubenswrapper[4921]: I0318 14:16:48.211980 4921 generic.go:334] "Generic (PLEG): container finished" podID="30d774e9-5ee7-42b1-8883-ddedecfcaa13" containerID="48b0b36b732d2a4815132731ca2b7af7be001f762a0efa3fb58893865946e8e2" exitCode=0 Mar 18 14:16:48 crc kubenswrapper[4921]: I0318 14:16:48.212064 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" event={"ID":"30d774e9-5ee7-42b1-8883-ddedecfcaa13","Type":"ContainerDied","Data":"48b0b36b732d2a4815132731ca2b7af7be001f762a0efa3fb58893865946e8e2"} Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.666860 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.782740 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt8gr\" (UniqueName: \"kubernetes.io/projected/30d774e9-5ee7-42b1-8883-ddedecfcaa13-kube-api-access-dt8gr\") pod \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.782987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-inventory\") pod \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.783041 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ceph\") pod \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.783126 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ssh-key-openstack-cell1\") pod \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\" (UID: \"30d774e9-5ee7-42b1-8883-ddedecfcaa13\") " Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.789190 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d774e9-5ee7-42b1-8883-ddedecfcaa13-kube-api-access-dt8gr" (OuterVolumeSpecName: "kube-api-access-dt8gr") pod "30d774e9-5ee7-42b1-8883-ddedecfcaa13" (UID: "30d774e9-5ee7-42b1-8883-ddedecfcaa13"). InnerVolumeSpecName "kube-api-access-dt8gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.796517 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ceph" (OuterVolumeSpecName: "ceph") pod "30d774e9-5ee7-42b1-8883-ddedecfcaa13" (UID: "30d774e9-5ee7-42b1-8883-ddedecfcaa13"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.814687 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "30d774e9-5ee7-42b1-8883-ddedecfcaa13" (UID: "30d774e9-5ee7-42b1-8883-ddedecfcaa13"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.824040 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-inventory" (OuterVolumeSpecName: "inventory") pod "30d774e9-5ee7-42b1-8883-ddedecfcaa13" (UID: "30d774e9-5ee7-42b1-8883-ddedecfcaa13"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.888567 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.888605 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.888614 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30d774e9-5ee7-42b1-8883-ddedecfcaa13-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:49 crc kubenswrapper[4921]: I0318 14:16:49.888625 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt8gr\" (UniqueName: \"kubernetes.io/projected/30d774e9-5ee7-42b1-8883-ddedecfcaa13-kube-api-access-dt8gr\") on node \"crc\" DevicePath \"\"" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.232587 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" event={"ID":"30d774e9-5ee7-42b1-8883-ddedecfcaa13","Type":"ContainerDied","Data":"6122df3fbacf421a16f387e28a04f8b4e34e3d72e51e6cf8e278031156b5f1e9"} Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.232628 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6122df3fbacf421a16f387e28a04f8b4e34e3d72e51e6cf8e278031156b5f1e9" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.232680 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-95t9s" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.335972 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-fhklf"] Mar 18 14:16:50 crc kubenswrapper[4921]: E0318 14:16:50.336640 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d774e9-5ee7-42b1-8883-ddedecfcaa13" containerName="validate-network-openstack-openstack-cell1" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.336661 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d774e9-5ee7-42b1-8883-ddedecfcaa13" containerName="validate-network-openstack-openstack-cell1" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.336915 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d774e9-5ee7-42b1-8883-ddedecfcaa13" containerName="validate-network-openstack-openstack-cell1" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.337890 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.340524 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.340739 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.340870 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.341033 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.345856 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-fhklf"] Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.504234 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ceph\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.504286 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55h65\" (UniqueName: \"kubernetes.io/projected/978bea76-f0d3-43d3-9a2e-e024cfd086fa-kube-api-access-55h65\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.504317 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-inventory\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.504964 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.606376 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.606511 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ceph\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.606540 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55h65\" (UniqueName: \"kubernetes.io/projected/978bea76-f0d3-43d3-9a2e-e024cfd086fa-kube-api-access-55h65\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: 
I0318 14:16:50.606566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-inventory\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.612331 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.626853 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ceph\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.627063 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-inventory\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.629730 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55h65\" (UniqueName: \"kubernetes.io/projected/978bea76-f0d3-43d3-9a2e-e024cfd086fa-kube-api-access-55h65\") pod \"install-os-openstack-openstack-cell1-fhklf\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 
14:16:50 crc kubenswrapper[4921]: I0318 14:16:50.666948 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:16:51 crc kubenswrapper[4921]: I0318 14:16:51.280752 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-fhklf"] Mar 18 14:16:52 crc kubenswrapper[4921]: I0318 14:16:52.252009 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fhklf" event={"ID":"978bea76-f0d3-43d3-9a2e-e024cfd086fa","Type":"ContainerStarted","Data":"d1b1d381ed4c8e80438eab9fec84c5f5000b9a04cf8d7cbfbc494ddaa013c411"} Mar 18 14:16:52 crc kubenswrapper[4921]: I0318 14:16:52.252584 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fhklf" event={"ID":"978bea76-f0d3-43d3-9a2e-e024cfd086fa","Type":"ContainerStarted","Data":"fcb656db3f02d66d2219bd9a026c0e6532b1c86cdd9e82b8eb00d3e699dede73"} Mar 18 14:16:52 crc kubenswrapper[4921]: I0318 14:16:52.276362 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-fhklf" podStartSLOduration=2.093947104 podStartE2EDuration="2.276339614s" podCreationTimestamp="2026-03-18 14:16:50 +0000 UTC" firstStartedPulling="2026-03-18 14:16:51.29231911 +0000 UTC m=+7630.842239749" lastFinishedPulling="2026-03-18 14:16:51.47471162 +0000 UTC m=+7631.024632259" observedRunningTime="2026-03-18 14:16:52.267203157 +0000 UTC m=+7631.817123846" watchObservedRunningTime="2026-03-18 14:16:52.276339614 +0000 UTC m=+7631.826260253" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.242344 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bxftd"] Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.245832 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.255847 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-catalog-content\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.256078 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-utilities\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.256227 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpn6f\" (UniqueName: \"kubernetes.io/projected/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-kube-api-access-qpn6f\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.262206 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxftd"] Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.357428 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-catalog-content\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.357666 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-utilities\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.357728 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpn6f\" (UniqueName: \"kubernetes.io/projected/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-kube-api-access-qpn6f\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.357967 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-catalog-content\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.358228 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-utilities\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.383506 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpn6f\" (UniqueName: \"kubernetes.io/projected/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-kube-api-access-qpn6f\") pod \"certified-operators-bxftd\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:14 crc kubenswrapper[4921]: I0318 14:17:14.581145 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:15 crc kubenswrapper[4921]: W0318 14:17:15.064490 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a87d6a7_6213_4dcc_8d0b_5d45e06840ad.slice/crio-abaa46d5cd2be3dc5857ff194427a9f07340dcff68fef42d484c8f4e43b331ff WatchSource:0}: Error finding container abaa46d5cd2be3dc5857ff194427a9f07340dcff68fef42d484c8f4e43b331ff: Status 404 returned error can't find the container with id abaa46d5cd2be3dc5857ff194427a9f07340dcff68fef42d484c8f4e43b331ff Mar 18 14:17:15 crc kubenswrapper[4921]: I0318 14:17:15.069742 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bxftd"] Mar 18 14:17:15 crc kubenswrapper[4921]: I0318 14:17:15.510328 4921 generic.go:334] "Generic (PLEG): container finished" podID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerID="9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f" exitCode=0 Mar 18 14:17:15 crc kubenswrapper[4921]: I0318 14:17:15.510382 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerDied","Data":"9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f"} Mar 18 14:17:15 crc kubenswrapper[4921]: I0318 14:17:15.510422 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerStarted","Data":"abaa46d5cd2be3dc5857ff194427a9f07340dcff68fef42d484c8f4e43b331ff"} Mar 18 14:17:18 crc kubenswrapper[4921]: I0318 14:17:18.543894 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" 
event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerStarted","Data":"e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3"} Mar 18 14:17:19 crc kubenswrapper[4921]: I0318 14:17:19.554897 4921 generic.go:334] "Generic (PLEG): container finished" podID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerID="e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3" exitCode=0 Mar 18 14:17:19 crc kubenswrapper[4921]: I0318 14:17:19.555181 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerDied","Data":"e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3"} Mar 18 14:17:20 crc kubenswrapper[4921]: I0318 14:17:20.574940 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerStarted","Data":"ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9"} Mar 18 14:17:20 crc kubenswrapper[4921]: I0318 14:17:20.619528 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bxftd" podStartSLOduration=2.142939814 podStartE2EDuration="6.619501319s" podCreationTimestamp="2026-03-18 14:17:14 +0000 UTC" firstStartedPulling="2026-03-18 14:17:15.513025938 +0000 UTC m=+7655.062946577" lastFinishedPulling="2026-03-18 14:17:19.989587423 +0000 UTC m=+7659.539508082" observedRunningTime="2026-03-18 14:17:20.607685571 +0000 UTC m=+7660.157606210" watchObservedRunningTime="2026-03-18 14:17:20.619501319 +0000 UTC m=+7660.169421968" Mar 18 14:17:24 crc kubenswrapper[4921]: I0318 14:17:24.581667 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:24 crc kubenswrapper[4921]: I0318 14:17:24.582239 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:24 crc kubenswrapper[4921]: I0318 14:17:24.647647 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:34 crc kubenswrapper[4921]: I0318 14:17:34.637185 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:34 crc kubenswrapper[4921]: I0318 14:17:34.686634 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxftd"] Mar 18 14:17:34 crc kubenswrapper[4921]: I0318 14:17:34.723276 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bxftd" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="registry-server" containerID="cri-o://ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9" gracePeriod=2 Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.273060 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.378181 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-catalog-content\") pod \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.378262 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpn6f\" (UniqueName: \"kubernetes.io/projected/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-kube-api-access-qpn6f\") pod \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.378629 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-utilities\") pod \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\" (UID: \"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad\") " Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.379995 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-utilities" (OuterVolumeSpecName: "utilities") pod "9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" (UID: "9a87d6a7-6213-4dcc-8d0b-5d45e06840ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.387840 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-kube-api-access-qpn6f" (OuterVolumeSpecName: "kube-api-access-qpn6f") pod "9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" (UID: "9a87d6a7-6213-4dcc-8d0b-5d45e06840ad"). InnerVolumeSpecName "kube-api-access-qpn6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.440475 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" (UID: "9a87d6a7-6213-4dcc-8d0b-5d45e06840ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.481284 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.481325 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.481343 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpn6f\" (UniqueName: \"kubernetes.io/projected/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad-kube-api-access-qpn6f\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.733445 4921 generic.go:334] "Generic (PLEG): container finished" podID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerID="ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9" exitCode=0 Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.733498 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerDied","Data":"ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9"} Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.733527 4921 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bxftd" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.733561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bxftd" event={"ID":"9a87d6a7-6213-4dcc-8d0b-5d45e06840ad","Type":"ContainerDied","Data":"abaa46d5cd2be3dc5857ff194427a9f07340dcff68fef42d484c8f4e43b331ff"} Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.733579 4921 scope.go:117] "RemoveContainer" containerID="ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.759031 4921 scope.go:117] "RemoveContainer" containerID="e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.771982 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bxftd"] Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.781269 4921 scope.go:117] "RemoveContainer" containerID="9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.783069 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bxftd"] Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.837076 4921 scope.go:117] "RemoveContainer" containerID="ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9" Mar 18 14:17:35 crc kubenswrapper[4921]: E0318 14:17:35.837553 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9\": container with ID starting with ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9 not found: ID does not exist" containerID="ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.837583 
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9"} err="failed to get container status \"ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9\": rpc error: code = NotFound desc = could not find container \"ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9\": container with ID starting with ee80b6e14ce2b97142b3347913ba6ed68362e67dccf0d4285135f883ea3a8ef9 not found: ID does not exist" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.837601 4921 scope.go:117] "RemoveContainer" containerID="e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3" Mar 18 14:17:35 crc kubenswrapper[4921]: E0318 14:17:35.837918 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3\": container with ID starting with e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3 not found: ID does not exist" containerID="e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.837982 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3"} err="failed to get container status \"e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3\": rpc error: code = NotFound desc = could not find container \"e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3\": container with ID starting with e43362e522a095f0b4e1fe7b293667dcffec8468cef9ee5e372a3811656e79d3 not found: ID does not exist" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.838012 4921 scope.go:117] "RemoveContainer" containerID="9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f" Mar 18 14:17:35 crc kubenswrapper[4921]: E0318 
14:17:35.838813 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f\": container with ID starting with 9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f not found: ID does not exist" containerID="9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f" Mar 18 14:17:35 crc kubenswrapper[4921]: I0318 14:17:35.838839 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f"} err="failed to get container status \"9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f\": rpc error: code = NotFound desc = could not find container \"9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f\": container with ID starting with 9006e2b7ae96c002e4753a9f7cd25a3707993697a134ae113bb8fd5c64a8899f not found: ID does not exist" Mar 18 14:17:37 crc kubenswrapper[4921]: I0318 14:17:37.221463 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" path="/var/lib/kubelet/pods/9a87d6a7-6213-4dcc-8d0b-5d45e06840ad/volumes" Mar 18 14:17:40 crc kubenswrapper[4921]: I0318 14:17:40.783774 4921 generic.go:334] "Generic (PLEG): container finished" podID="978bea76-f0d3-43d3-9a2e-e024cfd086fa" containerID="d1b1d381ed4c8e80438eab9fec84c5f5000b9a04cf8d7cbfbc494ddaa013c411" exitCode=0 Mar 18 14:17:40 crc kubenswrapper[4921]: I0318 14:17:40.783859 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fhklf" event={"ID":"978bea76-f0d3-43d3-9a2e-e024cfd086fa","Type":"ContainerDied","Data":"d1b1d381ed4c8e80438eab9fec84c5f5000b9a04cf8d7cbfbc494ddaa013c411"} Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.242576 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.336802 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-inventory\") pod \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.336926 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ceph\") pod \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.336982 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55h65\" (UniqueName: \"kubernetes.io/projected/978bea76-f0d3-43d3-9a2e-e024cfd086fa-kube-api-access-55h65\") pod \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.337060 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ssh-key-openstack-cell1\") pod \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\" (UID: \"978bea76-f0d3-43d3-9a2e-e024cfd086fa\") " Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.344078 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978bea76-f0d3-43d3-9a2e-e024cfd086fa-kube-api-access-55h65" (OuterVolumeSpecName: "kube-api-access-55h65") pod "978bea76-f0d3-43d3-9a2e-e024cfd086fa" (UID: "978bea76-f0d3-43d3-9a2e-e024cfd086fa"). InnerVolumeSpecName "kube-api-access-55h65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.345798 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ceph" (OuterVolumeSpecName: "ceph") pod "978bea76-f0d3-43d3-9a2e-e024cfd086fa" (UID: "978bea76-f0d3-43d3-9a2e-e024cfd086fa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.372751 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-inventory" (OuterVolumeSpecName: "inventory") pod "978bea76-f0d3-43d3-9a2e-e024cfd086fa" (UID: "978bea76-f0d3-43d3-9a2e-e024cfd086fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.381618 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "978bea76-f0d3-43d3-9a2e-e024cfd086fa" (UID: "978bea76-f0d3-43d3-9a2e-e024cfd086fa"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.440042 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.440082 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.440092 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55h65\" (UniqueName: \"kubernetes.io/projected/978bea76-f0d3-43d3-9a2e-e024cfd086fa-kube-api-access-55h65\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.440104 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/978bea76-f0d3-43d3-9a2e-e024cfd086fa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.807590 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-fhklf" event={"ID":"978bea76-f0d3-43d3-9a2e-e024cfd086fa","Type":"ContainerDied","Data":"fcb656db3f02d66d2219bd9a026c0e6532b1c86cdd9e82b8eb00d3e699dede73"} Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.807929 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb656db3f02d66d2219bd9a026c0e6532b1c86cdd9e82b8eb00d3e699dede73" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.807688 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-fhklf" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.898500 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9gzc2"] Mar 18 14:17:42 crc kubenswrapper[4921]: E0318 14:17:42.899155 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="extract-utilities" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.899183 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="extract-utilities" Mar 18 14:17:42 crc kubenswrapper[4921]: E0318 14:17:42.899206 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="extract-content" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.899217 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="extract-content" Mar 18 14:17:42 crc kubenswrapper[4921]: E0318 14:17:42.899249 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978bea76-f0d3-43d3-9a2e-e024cfd086fa" containerName="install-os-openstack-openstack-cell1" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.899263 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="978bea76-f0d3-43d3-9a2e-e024cfd086fa" containerName="install-os-openstack-openstack-cell1" Mar 18 14:17:42 crc kubenswrapper[4921]: E0318 14:17:42.899288 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="registry-server" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.899299 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="registry-server" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.899735 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9a87d6a7-6213-4dcc-8d0b-5d45e06840ad" containerName="registry-server" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.899761 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="978bea76-f0d3-43d3-9a2e-e024cfd086fa" containerName="install-os-openstack-openstack-cell1" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.900839 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.902927 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.903005 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.902942 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.904419 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.926588 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9gzc2"] Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.950486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smf6x\" (UniqueName: \"kubernetes.io/projected/501d322f-bf37-45bc-8604-0099a6408aac-kube-api-access-smf6x\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.950542 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ceph\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.950562 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:42 crc kubenswrapper[4921]: I0318 14:17:42.950585 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-inventory\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.052518 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smf6x\" (UniqueName: \"kubernetes.io/projected/501d322f-bf37-45bc-8604-0099a6408aac-kube-api-access-smf6x\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.052586 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ceph\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " 
pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.052615 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.052647 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-inventory\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.057900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.058270 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-inventory\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.059313 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ceph\") pod 
\"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.073650 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smf6x\" (UniqueName: \"kubernetes.io/projected/501d322f-bf37-45bc-8604-0099a6408aac-kube-api-access-smf6x\") pod \"configure-os-openstack-openstack-cell1-9gzc2\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.224495 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.806055 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9gzc2"] Mar 18 14:17:43 crc kubenswrapper[4921]: I0318 14:17:43.821460 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" event={"ID":"501d322f-bf37-45bc-8604-0099a6408aac","Type":"ContainerStarted","Data":"970c9fb8ffe4007bad6ee8ced143ab596094b272c08833a6b5cc603ee4fd46a5"} Mar 18 14:17:44 crc kubenswrapper[4921]: I0318 14:17:44.836303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" event={"ID":"501d322f-bf37-45bc-8604-0099a6408aac","Type":"ContainerStarted","Data":"eea85f71af2e4e2a3160b253f67ed7445716700f2f6542619c28502e032250ef"} Mar 18 14:17:47 crc kubenswrapper[4921]: I0318 14:17:47.081154 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:17:47 
crc kubenswrapper[4921]: I0318 14:17:47.081824 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.155352 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" podStartSLOduration=17.987757345 podStartE2EDuration="18.155331017s" podCreationTimestamp="2026-03-18 14:17:42 +0000 UTC" firstStartedPulling="2026-03-18 14:17:43.796395504 +0000 UTC m=+7683.346316143" lastFinishedPulling="2026-03-18 14:17:43.963969166 +0000 UTC m=+7683.513889815" observedRunningTime="2026-03-18 14:17:44.85419135 +0000 UTC m=+7684.404111999" watchObservedRunningTime="2026-03-18 14:18:00.155331017 +0000 UTC m=+7699.705251656" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.160436 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564058-ch5hf"] Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.179654 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.185536 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.187213 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.187869 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.219787 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-ch5hf"] Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.234782 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgn9b\" (UniqueName: \"kubernetes.io/projected/4b552589-8f3a-4e6f-84f0-7a6067832f18-kube-api-access-cgn9b\") pod \"auto-csr-approver-29564058-ch5hf\" (UID: \"4b552589-8f3a-4e6f-84f0-7a6067832f18\") " pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.336542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgn9b\" (UniqueName: \"kubernetes.io/projected/4b552589-8f3a-4e6f-84f0-7a6067832f18-kube-api-access-cgn9b\") pod \"auto-csr-approver-29564058-ch5hf\" (UID: \"4b552589-8f3a-4e6f-84f0-7a6067832f18\") " pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.354186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgn9b\" (UniqueName: \"kubernetes.io/projected/4b552589-8f3a-4e6f-84f0-7a6067832f18-kube-api-access-cgn9b\") pod \"auto-csr-approver-29564058-ch5hf\" (UID: \"4b552589-8f3a-4e6f-84f0-7a6067832f18\") " 
pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:00 crc kubenswrapper[4921]: I0318 14:18:00.515735 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:01 crc kubenswrapper[4921]: I0318 14:18:01.027403 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-ch5hf"] Mar 18 14:18:02 crc kubenswrapper[4921]: I0318 14:18:02.016582 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" event={"ID":"4b552589-8f3a-4e6f-84f0-7a6067832f18","Type":"ContainerStarted","Data":"af9ee43a1f4d4930ffebb9de668df9445a92a32f1f92a31959d9e544fa47935e"} Mar 18 14:18:03 crc kubenswrapper[4921]: I0318 14:18:03.027329 4921 generic.go:334] "Generic (PLEG): container finished" podID="4b552589-8f3a-4e6f-84f0-7a6067832f18" containerID="73985e4ba40604e9c0f7b4bef7453aec80d645bd8bd8159e8af6801c87d2486a" exitCode=0 Mar 18 14:18:03 crc kubenswrapper[4921]: I0318 14:18:03.027382 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" event={"ID":"4b552589-8f3a-4e6f-84f0-7a6067832f18","Type":"ContainerDied","Data":"73985e4ba40604e9c0f7b4bef7453aec80d645bd8bd8159e8af6801c87d2486a"} Mar 18 14:18:04 crc kubenswrapper[4921]: I0318 14:18:04.407793 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:04 crc kubenswrapper[4921]: I0318 14:18:04.532861 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgn9b\" (UniqueName: \"kubernetes.io/projected/4b552589-8f3a-4e6f-84f0-7a6067832f18-kube-api-access-cgn9b\") pod \"4b552589-8f3a-4e6f-84f0-7a6067832f18\" (UID: \"4b552589-8f3a-4e6f-84f0-7a6067832f18\") " Mar 18 14:18:04 crc kubenswrapper[4921]: I0318 14:18:04.538632 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b552589-8f3a-4e6f-84f0-7a6067832f18-kube-api-access-cgn9b" (OuterVolumeSpecName: "kube-api-access-cgn9b") pod "4b552589-8f3a-4e6f-84f0-7a6067832f18" (UID: "4b552589-8f3a-4e6f-84f0-7a6067832f18"). InnerVolumeSpecName "kube-api-access-cgn9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:04 crc kubenswrapper[4921]: I0318 14:18:04.635950 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgn9b\" (UniqueName: \"kubernetes.io/projected/4b552589-8f3a-4e6f-84f0-7a6067832f18-kube-api-access-cgn9b\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:05 crc kubenswrapper[4921]: I0318 14:18:05.048183 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" event={"ID":"4b552589-8f3a-4e6f-84f0-7a6067832f18","Type":"ContainerDied","Data":"af9ee43a1f4d4930ffebb9de668df9445a92a32f1f92a31959d9e544fa47935e"} Mar 18 14:18:05 crc kubenswrapper[4921]: I0318 14:18:05.048520 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9ee43a1f4d4930ffebb9de668df9445a92a32f1f92a31959d9e544fa47935e" Mar 18 14:18:05 crc kubenswrapper[4921]: I0318 14:18:05.048309 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564058-ch5hf" Mar 18 14:18:05 crc kubenswrapper[4921]: I0318 14:18:05.493999 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-zd4lh"] Mar 18 14:18:05 crc kubenswrapper[4921]: I0318 14:18:05.502832 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564052-zd4lh"] Mar 18 14:18:07 crc kubenswrapper[4921]: I0318 14:18:07.229991 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982ccb12-8974-497a-96d8-85f699f82204" path="/var/lib/kubelet/pods/982ccb12-8974-497a-96d8-85f699f82204/volumes" Mar 18 14:18:10 crc kubenswrapper[4921]: I0318 14:18:10.899744 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nwxhz"] Mar 18 14:18:10 crc kubenswrapper[4921]: E0318 14:18:10.900972 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b552589-8f3a-4e6f-84f0-7a6067832f18" containerName="oc" Mar 18 14:18:10 crc kubenswrapper[4921]: I0318 14:18:10.900992 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b552589-8f3a-4e6f-84f0-7a6067832f18" containerName="oc" Mar 18 14:18:10 crc kubenswrapper[4921]: I0318 14:18:10.901370 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b552589-8f3a-4e6f-84f0-7a6067832f18" containerName="oc" Mar 18 14:18:10 crc kubenswrapper[4921]: I0318 14:18:10.903853 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:10 crc kubenswrapper[4921]: I0318 14:18:10.917147 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwxhz"] Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.012995 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4r8\" (UniqueName: \"kubernetes.io/projected/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-kube-api-access-4d4r8\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.013443 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-catalog-content\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.013499 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-utilities\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.116682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-catalog-content\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.116735 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-utilities\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.116826 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d4r8\" (UniqueName: \"kubernetes.io/projected/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-kube-api-access-4d4r8\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.117354 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-utilities\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.117415 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-catalog-content\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.137067 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4r8\" (UniqueName: \"kubernetes.io/projected/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-kube-api-access-4d4r8\") pod \"community-operators-nwxhz\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.229931 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:11 crc kubenswrapper[4921]: W0318 14:18:11.804654 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89adab88_32cb_4a2b_b2a0_ff2cd1831fad.slice/crio-7ee749842b0038b1969db3f36219ae0bdcb56023818a870fe700b4a64062ec1a WatchSource:0}: Error finding container 7ee749842b0038b1969db3f36219ae0bdcb56023818a870fe700b4a64062ec1a: Status 404 returned error can't find the container with id 7ee749842b0038b1969db3f36219ae0bdcb56023818a870fe700b4a64062ec1a Mar 18 14:18:11 crc kubenswrapper[4921]: I0318 14:18:11.806556 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwxhz"] Mar 18 14:18:12 crc kubenswrapper[4921]: I0318 14:18:12.113513 4921 generic.go:334] "Generic (PLEG): container finished" podID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerID="1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382" exitCode=0 Mar 18 14:18:12 crc kubenswrapper[4921]: I0318 14:18:12.113563 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerDied","Data":"1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382"} Mar 18 14:18:12 crc kubenswrapper[4921]: I0318 14:18:12.113595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerStarted","Data":"7ee749842b0038b1969db3f36219ae0bdcb56023818a870fe700b4a64062ec1a"} Mar 18 14:18:12 crc kubenswrapper[4921]: I0318 14:18:12.115279 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:18:14 crc kubenswrapper[4921]: I0318 14:18:14.135944 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerStarted","Data":"fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64"} Mar 18 14:18:15 crc kubenswrapper[4921]: I0318 14:18:15.148062 4921 generic.go:334] "Generic (PLEG): container finished" podID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerID="fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64" exitCode=0 Mar 18 14:18:15 crc kubenswrapper[4921]: I0318 14:18:15.148177 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerDied","Data":"fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64"} Mar 18 14:18:16 crc kubenswrapper[4921]: I0318 14:18:16.161875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerStarted","Data":"9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1"} Mar 18 14:18:16 crc kubenswrapper[4921]: I0318 14:18:16.186362 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nwxhz" podStartSLOduration=2.64345868 podStartE2EDuration="6.186343465s" podCreationTimestamp="2026-03-18 14:18:10 +0000 UTC" firstStartedPulling="2026-03-18 14:18:12.115038506 +0000 UTC m=+7711.664959145" lastFinishedPulling="2026-03-18 14:18:15.657923291 +0000 UTC m=+7715.207843930" observedRunningTime="2026-03-18 14:18:16.176350042 +0000 UTC m=+7715.726270691" watchObservedRunningTime="2026-03-18 14:18:16.186343465 +0000 UTC m=+7715.736264104" Mar 18 14:18:16 crc kubenswrapper[4921]: I0318 14:18:16.542229 4921 scope.go:117] "RemoveContainer" containerID="67203d3bd278ddbcf0dbff5fc39ffe09fcac385d80548180f4845f0c69f20ae1" Mar 18 14:18:17 crc kubenswrapper[4921]: I0318 14:18:17.081019 4921 
patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:18:17 crc kubenswrapper[4921]: I0318 14:18:17.081129 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:18:21 crc kubenswrapper[4921]: I0318 14:18:21.236143 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:21 crc kubenswrapper[4921]: I0318 14:18:21.236693 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:21 crc kubenswrapper[4921]: I0318 14:18:21.275273 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:22 crc kubenswrapper[4921]: I0318 14:18:22.287776 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:22 crc kubenswrapper[4921]: I0318 14:18:22.335303 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwxhz"] Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.256340 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nwxhz" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="registry-server" containerID="cri-o://9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1" gracePeriod=2 Mar 18 14:18:24 
crc kubenswrapper[4921]: I0318 14:18:24.740707 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.818578 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-catalog-content\") pod \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.818768 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4r8\" (UniqueName: \"kubernetes.io/projected/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-kube-api-access-4d4r8\") pod \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.818885 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-utilities\") pod \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\" (UID: \"89adab88-32cb-4a2b-b2a0-ff2cd1831fad\") " Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.820048 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-utilities" (OuterVolumeSpecName: "utilities") pod "89adab88-32cb-4a2b-b2a0-ff2cd1831fad" (UID: "89adab88-32cb-4a2b-b2a0-ff2cd1831fad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.836905 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-kube-api-access-4d4r8" (OuterVolumeSpecName: "kube-api-access-4d4r8") pod "89adab88-32cb-4a2b-b2a0-ff2cd1831fad" (UID: "89adab88-32cb-4a2b-b2a0-ff2cd1831fad"). InnerVolumeSpecName "kube-api-access-4d4r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.878931 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89adab88-32cb-4a2b-b2a0-ff2cd1831fad" (UID: "89adab88-32cb-4a2b-b2a0-ff2cd1831fad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.921249 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.921297 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:24 crc kubenswrapper[4921]: I0318 14:18:24.921310 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4r8\" (UniqueName: \"kubernetes.io/projected/89adab88-32cb-4a2b-b2a0-ff2cd1831fad-kube-api-access-4d4r8\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.267760 4921 generic.go:334] "Generic (PLEG): container finished" podID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" 
containerID="9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1" exitCode=0 Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.267941 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerDied","Data":"9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1"} Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.268146 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwxhz" event={"ID":"89adab88-32cb-4a2b-b2a0-ff2cd1831fad","Type":"ContainerDied","Data":"7ee749842b0038b1969db3f36219ae0bdcb56023818a870fe700b4a64062ec1a"} Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.268172 4921 scope.go:117] "RemoveContainer" containerID="9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.268030 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwxhz" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.295020 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwxhz"] Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.303626 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nwxhz"] Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.307296 4921 scope.go:117] "RemoveContainer" containerID="fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.331014 4921 scope.go:117] "RemoveContainer" containerID="1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.393044 4921 scope.go:117] "RemoveContainer" containerID="9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1" Mar 18 14:18:25 crc kubenswrapper[4921]: E0318 14:18:25.393522 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1\": container with ID starting with 9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1 not found: ID does not exist" containerID="9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.393555 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1"} err="failed to get container status \"9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1\": rpc error: code = NotFound desc = could not find container \"9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1\": container with ID starting with 9129277bd894aed85c8438dff3a27f2e7d6347572afca6552fb015f304a3ddd1 not 
found: ID does not exist" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.393575 4921 scope.go:117] "RemoveContainer" containerID="fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64" Mar 18 14:18:25 crc kubenswrapper[4921]: E0318 14:18:25.393812 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64\": container with ID starting with fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64 not found: ID does not exist" containerID="fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.393837 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64"} err="failed to get container status \"fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64\": rpc error: code = NotFound desc = could not find container \"fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64\": container with ID starting with fdfa787df9340d486cfb496b86a3a1d3a762febfd385f25699fd58e070bc7a64 not found: ID does not exist" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.393853 4921 scope.go:117] "RemoveContainer" containerID="1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382" Mar 18 14:18:25 crc kubenswrapper[4921]: E0318 14:18:25.394168 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382\": container with ID starting with 1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382 not found: ID does not exist" containerID="1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382" Mar 18 14:18:25 crc kubenswrapper[4921]: I0318 14:18:25.394198 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382"} err="failed to get container status \"1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382\": rpc error: code = NotFound desc = could not find container \"1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382\": container with ID starting with 1ad869b9321838bf6b0061e50ec21383aada5e1103d39c13eaeff48d5a734382 not found: ID does not exist" Mar 18 14:18:27 crc kubenswrapper[4921]: I0318 14:18:27.228648 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" path="/var/lib/kubelet/pods/89adab88-32cb-4a2b-b2a0-ff2cd1831fad/volumes" Mar 18 14:18:28 crc kubenswrapper[4921]: I0318 14:18:28.295593 4921 generic.go:334] "Generic (PLEG): container finished" podID="501d322f-bf37-45bc-8604-0099a6408aac" containerID="eea85f71af2e4e2a3160b253f67ed7445716700f2f6542619c28502e032250ef" exitCode=0 Mar 18 14:18:28 crc kubenswrapper[4921]: I0318 14:18:28.295900 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" event={"ID":"501d322f-bf37-45bc-8604-0099a6408aac","Type":"ContainerDied","Data":"eea85f71af2e4e2a3160b253f67ed7445716700f2f6542619c28502e032250ef"} Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.774488 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.825751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-inventory\") pod \"501d322f-bf37-45bc-8604-0099a6408aac\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.825977 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smf6x\" (UniqueName: \"kubernetes.io/projected/501d322f-bf37-45bc-8604-0099a6408aac-kube-api-access-smf6x\") pod \"501d322f-bf37-45bc-8604-0099a6408aac\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.826082 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ssh-key-openstack-cell1\") pod \"501d322f-bf37-45bc-8604-0099a6408aac\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.826159 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ceph\") pod \"501d322f-bf37-45bc-8604-0099a6408aac\" (UID: \"501d322f-bf37-45bc-8604-0099a6408aac\") " Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.831896 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501d322f-bf37-45bc-8604-0099a6408aac-kube-api-access-smf6x" (OuterVolumeSpecName: "kube-api-access-smf6x") pod "501d322f-bf37-45bc-8604-0099a6408aac" (UID: "501d322f-bf37-45bc-8604-0099a6408aac"). InnerVolumeSpecName "kube-api-access-smf6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.832079 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ceph" (OuterVolumeSpecName: "ceph") pod "501d322f-bf37-45bc-8604-0099a6408aac" (UID: "501d322f-bf37-45bc-8604-0099a6408aac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.861259 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-inventory" (OuterVolumeSpecName: "inventory") pod "501d322f-bf37-45bc-8604-0099a6408aac" (UID: "501d322f-bf37-45bc-8604-0099a6408aac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.861761 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "501d322f-bf37-45bc-8604-0099a6408aac" (UID: "501d322f-bf37-45bc-8604-0099a6408aac"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.929189 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.929249 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.929262 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/501d322f-bf37-45bc-8604-0099a6408aac-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:29 crc kubenswrapper[4921]: I0318 14:18:29.929276 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smf6x\" (UniqueName: \"kubernetes.io/projected/501d322f-bf37-45bc-8604-0099a6408aac-kube-api-access-smf6x\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.313676 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" event={"ID":"501d322f-bf37-45bc-8604-0099a6408aac","Type":"ContainerDied","Data":"970c9fb8ffe4007bad6ee8ced143ab596094b272c08833a6b5cc603ee4fd46a5"} Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.314242 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970c9fb8ffe4007bad6ee8ced143ab596094b272c08833a6b5cc603ee4fd46a5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.313747 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9gzc2" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411228 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-zhzh5"] Mar 18 14:18:30 crc kubenswrapper[4921]: E0318 14:18:30.411644 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="registry-server" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411659 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="registry-server" Mar 18 14:18:30 crc kubenswrapper[4921]: E0318 14:18:30.411670 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="extract-utilities" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411677 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="extract-utilities" Mar 18 14:18:30 crc kubenswrapper[4921]: E0318 14:18:30.411695 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501d322f-bf37-45bc-8604-0099a6408aac" containerName="configure-os-openstack-openstack-cell1" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411702 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="501d322f-bf37-45bc-8604-0099a6408aac" containerName="configure-os-openstack-openstack-cell1" Mar 18 14:18:30 crc kubenswrapper[4921]: E0318 14:18:30.411715 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="extract-content" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411722 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="extract-content" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411920 4921 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="501d322f-bf37-45bc-8604-0099a6408aac" containerName="configure-os-openstack-openstack-cell1" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.411937 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="89adab88-32cb-4a2b-b2a0-ff2cd1831fad" containerName="registry-server" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.412640 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.415263 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.415651 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.416281 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.416779 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.438699 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-zhzh5"] Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.440719 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.440805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgc2r\" 
(UniqueName: \"kubernetes.io/projected/69262610-6712-4921-8a8f-43713d54e987-kube-api-access-rgc2r\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.440945 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ceph\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.441023 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-inventory-0\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.542861 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-inventory-0\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.543007 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.543057 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rgc2r\" (UniqueName: \"kubernetes.io/projected/69262610-6712-4921-8a8f-43713d54e987-kube-api-access-rgc2r\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.543082 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ceph\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.547617 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ceph\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.548186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-inventory-0\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.551311 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.588922 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgc2r\" (UniqueName: 
\"kubernetes.io/projected/69262610-6712-4921-8a8f-43713d54e987-kube-api-access-rgc2r\") pod \"ssh-known-hosts-openstack-zhzh5\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:30 crc kubenswrapper[4921]: I0318 14:18:30.738293 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:31 crc kubenswrapper[4921]: I0318 14:18:31.337512 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-zhzh5"] Mar 18 14:18:31 crc kubenswrapper[4921]: W0318 14:18:31.355296 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69262610_6712_4921_8a8f_43713d54e987.slice/crio-50bcf38558c2a59a4da6b006257ca9db07dcfaad03288c593bcdaf11042ca995 WatchSource:0}: Error finding container 50bcf38558c2a59a4da6b006257ca9db07dcfaad03288c593bcdaf11042ca995: Status 404 returned error can't find the container with id 50bcf38558c2a59a4da6b006257ca9db07dcfaad03288c593bcdaf11042ca995 Mar 18 14:18:32 crc kubenswrapper[4921]: I0318 14:18:32.335713 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zhzh5" event={"ID":"69262610-6712-4921-8a8f-43713d54e987","Type":"ContainerStarted","Data":"31e00bdb8304329bf5868fa2fa8e0b78bd6f86b91dc489815523a053ec42798a"} Mar 18 14:18:32 crc kubenswrapper[4921]: I0318 14:18:32.336076 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zhzh5" event={"ID":"69262610-6712-4921-8a8f-43713d54e987","Type":"ContainerStarted","Data":"50bcf38558c2a59a4da6b006257ca9db07dcfaad03288c593bcdaf11042ca995"} Mar 18 14:18:32 crc kubenswrapper[4921]: I0318 14:18:32.353516 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-zhzh5" podStartSLOduration=2.172632998 
podStartE2EDuration="2.353494023s" podCreationTimestamp="2026-03-18 14:18:30 +0000 UTC" firstStartedPulling="2026-03-18 14:18:31.357160606 +0000 UTC m=+7730.907081245" lastFinishedPulling="2026-03-18 14:18:31.538021611 +0000 UTC m=+7731.087942270" observedRunningTime="2026-03-18 14:18:32.348050438 +0000 UTC m=+7731.897971087" watchObservedRunningTime="2026-03-18 14:18:32.353494023 +0000 UTC m=+7731.903414662" Mar 18 14:18:40 crc kubenswrapper[4921]: I0318 14:18:40.421134 4921 generic.go:334] "Generic (PLEG): container finished" podID="69262610-6712-4921-8a8f-43713d54e987" containerID="31e00bdb8304329bf5868fa2fa8e0b78bd6f86b91dc489815523a053ec42798a" exitCode=0 Mar 18 14:18:40 crc kubenswrapper[4921]: I0318 14:18:40.421218 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zhzh5" event={"ID":"69262610-6712-4921-8a8f-43713d54e987","Type":"ContainerDied","Data":"31e00bdb8304329bf5868fa2fa8e0b78bd6f86b91dc489815523a053ec42798a"} Mar 18 14:18:41 crc kubenswrapper[4921]: I0318 14:18:41.920945 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.005040 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-inventory-0\") pod \"69262610-6712-4921-8a8f-43713d54e987\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.005330 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ceph\") pod \"69262610-6712-4921-8a8f-43713d54e987\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.005581 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgc2r\" (UniqueName: \"kubernetes.io/projected/69262610-6712-4921-8a8f-43713d54e987-kube-api-access-rgc2r\") pod \"69262610-6712-4921-8a8f-43713d54e987\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.005644 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ssh-key-openstack-cell1\") pod \"69262610-6712-4921-8a8f-43713d54e987\" (UID: \"69262610-6712-4921-8a8f-43713d54e987\") " Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.020350 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69262610-6712-4921-8a8f-43713d54e987-kube-api-access-rgc2r" (OuterVolumeSpecName: "kube-api-access-rgc2r") pod "69262610-6712-4921-8a8f-43713d54e987" (UID: "69262610-6712-4921-8a8f-43713d54e987"). InnerVolumeSpecName "kube-api-access-rgc2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.032891 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ceph" (OuterVolumeSpecName: "ceph") pod "69262610-6712-4921-8a8f-43713d54e987" (UID: "69262610-6712-4921-8a8f-43713d54e987"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.064932 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "69262610-6712-4921-8a8f-43713d54e987" (UID: "69262610-6712-4921-8a8f-43713d54e987"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.089303 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "69262610-6712-4921-8a8f-43713d54e987" (UID: "69262610-6712-4921-8a8f-43713d54e987"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.108601 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.108651 4921 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.108665 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69262610-6712-4921-8a8f-43713d54e987-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.108676 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgc2r\" (UniqueName: \"kubernetes.io/projected/69262610-6712-4921-8a8f-43713d54e987-kube-api-access-rgc2r\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.442574 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zhzh5" event={"ID":"69262610-6712-4921-8a8f-43713d54e987","Type":"ContainerDied","Data":"50bcf38558c2a59a4da6b006257ca9db07dcfaad03288c593bcdaf11042ca995"} Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.442628 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50bcf38558c2a59a4da6b006257ca9db07dcfaad03288c593bcdaf11042ca995" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.442675 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zhzh5" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.527169 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-lv6w4"] Mar 18 14:18:42 crc kubenswrapper[4921]: E0318 14:18:42.528472 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69262610-6712-4921-8a8f-43713d54e987" containerName="ssh-known-hosts-openstack" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.528515 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="69262610-6712-4921-8a8f-43713d54e987" containerName="ssh-known-hosts-openstack" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.528782 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="69262610-6712-4921-8a8f-43713d54e987" containerName="ssh-known-hosts-openstack" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.529652 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.531904 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.532101 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.532491 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.532695 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.539250 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-lv6w4"] Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 
14:18:42.619696 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7wf\" (UniqueName: \"kubernetes.io/projected/0478480b-8d25-41f1-befb-14dfde857b39-kube-api-access-7t7wf\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.619772 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-inventory\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.619870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ceph\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.619903 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.721863 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: 
\"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.722132 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7wf\" (UniqueName: \"kubernetes.io/projected/0478480b-8d25-41f1-befb-14dfde857b39-kube-api-access-7t7wf\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.722200 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-inventory\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.722306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ceph\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.726894 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ceph\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.727150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: 
\"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.727014 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-inventory\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.737813 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7wf\" (UniqueName: \"kubernetes.io/projected/0478480b-8d25-41f1-befb-14dfde857b39-kube-api-access-7t7wf\") pod \"run-os-openstack-openstack-cell1-lv6w4\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:42 crc kubenswrapper[4921]: I0318 14:18:42.848986 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:43 crc kubenswrapper[4921]: W0318 14:18:43.405102 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0478480b_8d25_41f1_befb_14dfde857b39.slice/crio-410eb4e242d13fac1b43290de0afee97e89d3ac9ff27e50114d2dd97a21617ec WatchSource:0}: Error finding container 410eb4e242d13fac1b43290de0afee97e89d3ac9ff27e50114d2dd97a21617ec: Status 404 returned error can't find the container with id 410eb4e242d13fac1b43290de0afee97e89d3ac9ff27e50114d2dd97a21617ec Mar 18 14:18:43 crc kubenswrapper[4921]: I0318 14:18:43.409745 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-lv6w4"] Mar 18 14:18:43 crc kubenswrapper[4921]: I0318 14:18:43.451557 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" event={"ID":"0478480b-8d25-41f1-befb-14dfde857b39","Type":"ContainerStarted","Data":"410eb4e242d13fac1b43290de0afee97e89d3ac9ff27e50114d2dd97a21617ec"} Mar 18 14:18:44 crc kubenswrapper[4921]: I0318 14:18:44.460668 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" event={"ID":"0478480b-8d25-41f1-befb-14dfde857b39","Type":"ContainerStarted","Data":"c6b5054f3e3eaf339e057d8251f1736c048e4fa920494a0bcba9ccf05abf8471"} Mar 18 14:18:44 crc kubenswrapper[4921]: I0318 14:18:44.502432 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" podStartSLOduration=2.298769617 podStartE2EDuration="2.50240866s" podCreationTimestamp="2026-03-18 14:18:42 +0000 UTC" firstStartedPulling="2026-03-18 14:18:43.406821479 +0000 UTC m=+7742.956742118" lastFinishedPulling="2026-03-18 14:18:43.610460522 +0000 UTC m=+7743.160381161" observedRunningTime="2026-03-18 14:18:44.497054188 +0000 UTC m=+7744.046974837" 
watchObservedRunningTime="2026-03-18 14:18:44.50240866 +0000 UTC m=+7744.052329299" Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.081211 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.081905 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.081969 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.083076 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.083209 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" gracePeriod=600 Mar 18 14:18:47 crc kubenswrapper[4921]: E0318 14:18:47.219610 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.491955 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" exitCode=0 Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.492006 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88"} Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.492081 4921 scope.go:117] "RemoveContainer" containerID="765723c54e70151d9ddbc2c17504b39000a3457302886d7aa69d6270fb6acb47" Mar 18 14:18:47 crc kubenswrapper[4921]: I0318 14:18:47.493104 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:18:47 crc kubenswrapper[4921]: E0318 14:18:47.493756 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:18:52 crc kubenswrapper[4921]: I0318 14:18:52.553400 4921 generic.go:334] "Generic (PLEG): container finished" podID="0478480b-8d25-41f1-befb-14dfde857b39" 
containerID="c6b5054f3e3eaf339e057d8251f1736c048e4fa920494a0bcba9ccf05abf8471" exitCode=0 Mar 18 14:18:52 crc kubenswrapper[4921]: I0318 14:18:52.553473 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" event={"ID":"0478480b-8d25-41f1-befb-14dfde857b39","Type":"ContainerDied","Data":"c6b5054f3e3eaf339e057d8251f1736c048e4fa920494a0bcba9ccf05abf8471"} Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.013563 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.061133 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t7wf\" (UniqueName: \"kubernetes.io/projected/0478480b-8d25-41f1-befb-14dfde857b39-kube-api-access-7t7wf\") pod \"0478480b-8d25-41f1-befb-14dfde857b39\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.061188 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ceph\") pod \"0478480b-8d25-41f1-befb-14dfde857b39\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.066979 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0478480b-8d25-41f1-befb-14dfde857b39-kube-api-access-7t7wf" (OuterVolumeSpecName: "kube-api-access-7t7wf") pod "0478480b-8d25-41f1-befb-14dfde857b39" (UID: "0478480b-8d25-41f1-befb-14dfde857b39"). InnerVolumeSpecName "kube-api-access-7t7wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.071578 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ceph" (OuterVolumeSpecName: "ceph") pod "0478480b-8d25-41f1-befb-14dfde857b39" (UID: "0478480b-8d25-41f1-befb-14dfde857b39"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.162649 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-inventory\") pod \"0478480b-8d25-41f1-befb-14dfde857b39\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.162967 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ssh-key-openstack-cell1\") pod \"0478480b-8d25-41f1-befb-14dfde857b39\" (UID: \"0478480b-8d25-41f1-befb-14dfde857b39\") " Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.163326 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t7wf\" (UniqueName: \"kubernetes.io/projected/0478480b-8d25-41f1-befb-14dfde857b39-kube-api-access-7t7wf\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.163348 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.189508 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-inventory" (OuterVolumeSpecName: "inventory") pod "0478480b-8d25-41f1-befb-14dfde857b39" (UID: 
"0478480b-8d25-41f1-befb-14dfde857b39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.200494 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0478480b-8d25-41f1-befb-14dfde857b39" (UID: "0478480b-8d25-41f1-befb-14dfde857b39"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.265447 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.265702 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0478480b-8d25-41f1-befb-14dfde857b39-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.575549 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" event={"ID":"0478480b-8d25-41f1-befb-14dfde857b39","Type":"ContainerDied","Data":"410eb4e242d13fac1b43290de0afee97e89d3ac9ff27e50114d2dd97a21617ec"} Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.575824 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="410eb4e242d13fac1b43290de0afee97e89d3ac9ff27e50114d2dd97a21617ec" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.575639 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lv6w4" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.679294 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-wwg7b"] Mar 18 14:18:54 crc kubenswrapper[4921]: E0318 14:18:54.679773 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0478480b-8d25-41f1-befb-14dfde857b39" containerName="run-os-openstack-openstack-cell1" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.679793 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0478480b-8d25-41f1-befb-14dfde857b39" containerName="run-os-openstack-openstack-cell1" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.679982 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0478480b-8d25-41f1-befb-14dfde857b39" containerName="run-os-openstack-openstack-cell1" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.680723 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.684770 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.684934 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.685338 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.689237 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.697179 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-wwg7b"] Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.775337 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ceph\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.775381 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.775414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-inventory\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.775922 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzcls\" (UniqueName: \"kubernetes.io/projected/f8643aad-c6c3-42fb-85be-8227073e73c1-kube-api-access-qzcls\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.877133 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzcls\" (UniqueName: \"kubernetes.io/projected/f8643aad-c6c3-42fb-85be-8227073e73c1-kube-api-access-qzcls\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.877218 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ceph\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.877242 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 
14:18:54.877269 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-inventory\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.883611 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-inventory\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.883700 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.884164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ceph\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:54 crc kubenswrapper[4921]: I0318 14:18:54.910673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzcls\" (UniqueName: \"kubernetes.io/projected/f8643aad-c6c3-42fb-85be-8227073e73c1-kube-api-access-qzcls\") pod \"reboot-os-openstack-openstack-cell1-wwg7b\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:55 crc 
kubenswrapper[4921]: I0318 14:18:55.004129 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:18:55 crc kubenswrapper[4921]: I0318 14:18:55.548089 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-wwg7b"] Mar 18 14:18:55 crc kubenswrapper[4921]: W0318 14:18:55.555739 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8643aad_c6c3_42fb_85be_8227073e73c1.slice/crio-6b73a2eb90141775ece821c38a0839746c0cdb15da14aa6461afd275da003369 WatchSource:0}: Error finding container 6b73a2eb90141775ece821c38a0839746c0cdb15da14aa6461afd275da003369: Status 404 returned error can't find the container with id 6b73a2eb90141775ece821c38a0839746c0cdb15da14aa6461afd275da003369 Mar 18 14:18:55 crc kubenswrapper[4921]: I0318 14:18:55.586787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" event={"ID":"f8643aad-c6c3-42fb-85be-8227073e73c1","Type":"ContainerStarted","Data":"6b73a2eb90141775ece821c38a0839746c0cdb15da14aa6461afd275da003369"} Mar 18 14:18:56 crc kubenswrapper[4921]: I0318 14:18:56.602415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" event={"ID":"f8643aad-c6c3-42fb-85be-8227073e73c1","Type":"ContainerStarted","Data":"b693396e7bf33488ef3cec810202579a853bc1bff65c09992a7f37ba48fac503"} Mar 18 14:18:56 crc kubenswrapper[4921]: I0318 14:18:56.621819 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" podStartSLOduration=2.452433836 podStartE2EDuration="2.621797302s" podCreationTimestamp="2026-03-18 14:18:54 +0000 UTC" firstStartedPulling="2026-03-18 14:18:55.560015296 +0000 UTC m=+7755.109935935" lastFinishedPulling="2026-03-18 14:18:55.729378762 
+0000 UTC m=+7755.279299401" observedRunningTime="2026-03-18 14:18:56.616479731 +0000 UTC m=+7756.166400370" watchObservedRunningTime="2026-03-18 14:18:56.621797302 +0000 UTC m=+7756.171717941" Mar 18 14:19:02 crc kubenswrapper[4921]: I0318 14:19:02.210573 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:19:02 crc kubenswrapper[4921]: E0318 14:19:02.211625 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:19:12 crc kubenswrapper[4921]: I0318 14:19:12.779723 4921 generic.go:334] "Generic (PLEG): container finished" podID="f8643aad-c6c3-42fb-85be-8227073e73c1" containerID="b693396e7bf33488ef3cec810202579a853bc1bff65c09992a7f37ba48fac503" exitCode=0 Mar 18 14:19:12 crc kubenswrapper[4921]: I0318 14:19:12.779777 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" event={"ID":"f8643aad-c6c3-42fb-85be-8227073e73c1","Type":"ContainerDied","Data":"b693396e7bf33488ef3cec810202579a853bc1bff65c09992a7f37ba48fac503"} Mar 18 14:19:13 crc kubenswrapper[4921]: I0318 14:19:13.211236 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:19:13 crc kubenswrapper[4921]: E0318 14:19:13.211689 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.344008 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.504607 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ceph\") pod \"f8643aad-c6c3-42fb-85be-8227073e73c1\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.504671 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-inventory\") pod \"f8643aad-c6c3-42fb-85be-8227073e73c1\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.504725 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzcls\" (UniqueName: \"kubernetes.io/projected/f8643aad-c6c3-42fb-85be-8227073e73c1-kube-api-access-qzcls\") pod \"f8643aad-c6c3-42fb-85be-8227073e73c1\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.504818 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ssh-key-openstack-cell1\") pod \"f8643aad-c6c3-42fb-85be-8227073e73c1\" (UID: \"f8643aad-c6c3-42fb-85be-8227073e73c1\") " Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.522435 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f8643aad-c6c3-42fb-85be-8227073e73c1-kube-api-access-qzcls" (OuterVolumeSpecName: "kube-api-access-qzcls") pod "f8643aad-c6c3-42fb-85be-8227073e73c1" (UID: "f8643aad-c6c3-42fb-85be-8227073e73c1"). InnerVolumeSpecName "kube-api-access-qzcls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.549374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ceph" (OuterVolumeSpecName: "ceph") pod "f8643aad-c6c3-42fb-85be-8227073e73c1" (UID: "f8643aad-c6c3-42fb-85be-8227073e73c1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.570876 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f8643aad-c6c3-42fb-85be-8227073e73c1" (UID: "f8643aad-c6c3-42fb-85be-8227073e73c1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.578562 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-inventory" (OuterVolumeSpecName: "inventory") pod "f8643aad-c6c3-42fb-85be-8227073e73c1" (UID: "f8643aad-c6c3-42fb-85be-8227073e73c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.606921 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.606961 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.606972 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8643aad-c6c3-42fb-85be-8227073e73c1-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.606981 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzcls\" (UniqueName: \"kubernetes.io/projected/f8643aad-c6c3-42fb-85be-8227073e73c1-kube-api-access-qzcls\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.798065 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" event={"ID":"f8643aad-c6c3-42fb-85be-8227073e73c1","Type":"ContainerDied","Data":"6b73a2eb90141775ece821c38a0839746c0cdb15da14aa6461afd275da003369"} Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.798127 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b73a2eb90141775ece821c38a0839746c0cdb15da14aa6461afd275da003369" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.798167 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-wwg7b" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.881637 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-82j2d"] Mar 18 14:19:14 crc kubenswrapper[4921]: E0318 14:19:14.882042 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8643aad-c6c3-42fb-85be-8227073e73c1" containerName="reboot-os-openstack-openstack-cell1" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.882058 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8643aad-c6c3-42fb-85be-8227073e73c1" containerName="reboot-os-openstack-openstack-cell1" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.882293 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8643aad-c6c3-42fb-85be-8227073e73c1" containerName="reboot-os-openstack-openstack-cell1" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.882970 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.885623 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.885775 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.885791 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.886985 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:19:14 crc kubenswrapper[4921]: I0318 14:19:14.910584 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-82j2d"] Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.014918 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.014963 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.014987 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015021 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ceph\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015095 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbrj\" (UniqueName: \"kubernetes.io/projected/71031cc6-4940-49c7-acba-e58212bcf5f4-kube-api-access-ssbrj\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015147 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-inventory\") 
pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015198 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015221 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015280 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " 
pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.015336 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117322 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-inventory\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117459 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117529 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117557 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117661 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117685 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117707 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: 
\"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117745 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117788 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117857 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ceph\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.117895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbrj\" (UniqueName: \"kubernetes.io/projected/71031cc6-4940-49c7-acba-e58212bcf5f4-kube-api-access-ssbrj\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.122625 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.122669 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.122924 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.123588 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.124065 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-inventory\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " 
pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.126641 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.126699 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.126932 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.129443 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.130173 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ceph\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.130354 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.144510 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbrj\" (UniqueName: \"kubernetes.io/projected/71031cc6-4940-49c7-acba-e58212bcf5f4-kube-api-access-ssbrj\") pod \"install-certs-openstack-openstack-cell1-82j2d\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.213123 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:15 crc kubenswrapper[4921]: I0318 14:19:15.806853 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-82j2d"] Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.527090 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4sqv"] Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.542584 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4sqv"] Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.542683 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.664008 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-utilities\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.664097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlrkx\" (UniqueName: \"kubernetes.io/projected/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-kube-api-access-jlrkx\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.664329 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-catalog-content\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.767183 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-utilities\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.767652 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlrkx\" (UniqueName: \"kubernetes.io/projected/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-kube-api-access-jlrkx\") pod \"redhat-operators-n4sqv\" (UID: 
\"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.767704 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-catalog-content\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.767715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-utilities\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.767985 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-catalog-content\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.786084 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlrkx\" (UniqueName: \"kubernetes.io/projected/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-kube-api-access-jlrkx\") pod \"redhat-operators-n4sqv\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.819027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" event={"ID":"71031cc6-4940-49c7-acba-e58212bcf5f4","Type":"ContainerStarted","Data":"a11189f25aa16394d0e776c723a7726059da62369012f5de050f516cef2d548a"} Mar 18 14:19:16 crc 
kubenswrapper[4921]: I0318 14:19:16.819069 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" event={"ID":"71031cc6-4940-49c7-acba-e58212bcf5f4","Type":"ContainerStarted","Data":"ee9ac69f19d3d2eef1431e80c37987a846506fb798ac9996b28feb9275d43318"} Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.844986 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" podStartSLOduration=2.642513419 podStartE2EDuration="2.844969077s" podCreationTimestamp="2026-03-18 14:19:14 +0000 UTC" firstStartedPulling="2026-03-18 14:19:15.808908818 +0000 UTC m=+7775.358829457" lastFinishedPulling="2026-03-18 14:19:16.011364436 +0000 UTC m=+7775.561285115" observedRunningTime="2026-03-18 14:19:16.842588985 +0000 UTC m=+7776.392509614" watchObservedRunningTime="2026-03-18 14:19:16.844969077 +0000 UTC m=+7776.394889706" Mar 18 14:19:16 crc kubenswrapper[4921]: I0318 14:19:16.875786 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:17 crc kubenswrapper[4921]: I0318 14:19:17.388838 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4sqv"] Mar 18 14:19:17 crc kubenswrapper[4921]: I0318 14:19:17.830199 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerID="5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6" exitCode=0 Mar 18 14:19:17 crc kubenswrapper[4921]: I0318 14:19:17.830330 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerDied","Data":"5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6"} Mar 18 14:19:17 crc kubenswrapper[4921]: I0318 14:19:17.830774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerStarted","Data":"6b2abd3a6955ef3c6a72ccd00a63a94973b2c5dd2c4de2285edf819caca52566"} Mar 18 14:19:19 crc kubenswrapper[4921]: I0318 14:19:19.856176 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerStarted","Data":"15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745"} Mar 18 14:19:24 crc kubenswrapper[4921]: I0318 14:19:24.900047 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerID="15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745" exitCode=0 Mar 18 14:19:24 crc kubenswrapper[4921]: I0318 14:19:24.900132 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" 
event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerDied","Data":"15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745"} Mar 18 14:19:25 crc kubenswrapper[4921]: I0318 14:19:25.209797 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:19:25 crc kubenswrapper[4921]: E0318 14:19:25.210307 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:19:25 crc kubenswrapper[4921]: I0318 14:19:25.932984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerStarted","Data":"73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411"} Mar 18 14:19:25 crc kubenswrapper[4921]: I0318 14:19:25.957039 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4sqv" podStartSLOduration=2.419545486 podStartE2EDuration="9.957018436s" podCreationTimestamp="2026-03-18 14:19:16 +0000 UTC" firstStartedPulling="2026-03-18 14:19:17.832244919 +0000 UTC m=+7777.382165558" lastFinishedPulling="2026-03-18 14:19:25.369717869 +0000 UTC m=+7784.919638508" observedRunningTime="2026-03-18 14:19:25.9548559 +0000 UTC m=+7785.504776539" watchObservedRunningTime="2026-03-18 14:19:25.957018436 +0000 UTC m=+7785.506939075" Mar 18 14:19:26 crc kubenswrapper[4921]: I0318 14:19:26.876056 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:26 crc 
kubenswrapper[4921]: I0318 14:19:26.876624 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:27 crc kubenswrapper[4921]: I0318 14:19:27.924727 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n4sqv" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="registry-server" probeResult="failure" output=< Mar 18 14:19:27 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 14:19:27 crc kubenswrapper[4921]: > Mar 18 14:19:36 crc kubenswrapper[4921]: I0318 14:19:36.019250 4921 generic.go:334] "Generic (PLEG): container finished" podID="71031cc6-4940-49c7-acba-e58212bcf5f4" containerID="a11189f25aa16394d0e776c723a7726059da62369012f5de050f516cef2d548a" exitCode=0 Mar 18 14:19:36 crc kubenswrapper[4921]: I0318 14:19:36.019334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" event={"ID":"71031cc6-4940-49c7-acba-e58212bcf5f4","Type":"ContainerDied","Data":"a11189f25aa16394d0e776c723a7726059da62369012f5de050f516cef2d548a"} Mar 18 14:19:36 crc kubenswrapper[4921]: I0318 14:19:36.937614 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:36 crc kubenswrapper[4921]: I0318 14:19:36.993316 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.170885 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4sqv"] Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.502487 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.666361 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-inventory\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.666805 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-sriov-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.666991 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-nova-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667170 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbrj\" (UniqueName: \"kubernetes.io/projected/71031cc6-4940-49c7-acba-e58212bcf5f4-kube-api-access-ssbrj\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667292 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ceph\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667385 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ssh-key-openstack-cell1\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667479 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ovn-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667568 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-metadata-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667705 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-dhcp-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667819 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-libvirt-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.667983 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-bootstrap-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.668088 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-telemetry-combined-ca-bundle\") pod \"71031cc6-4940-49c7-acba-e58212bcf5f4\" (UID: \"71031cc6-4940-49c7-acba-e58212bcf5f4\") " Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.676389 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.676364 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71031cc6-4940-49c7-acba-e58212bcf5f4-kube-api-access-ssbrj" (OuterVolumeSpecName: "kube-api-access-ssbrj") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "kube-api-access-ssbrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.676879 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.676899 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.677511 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.677832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ceph" (OuterVolumeSpecName: "ceph") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.680468 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.681654 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.681928 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.682955 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.706602 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-inventory" (OuterVolumeSpecName: "inventory") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.707018 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "71031cc6-4940-49c7-acba-e58212bcf5f4" (UID: "71031cc6-4940-49c7-acba-e58212bcf5f4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770649 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770690 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770706 4921 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770720 4921 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770735 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 
14:19:37.770747 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770759 4921 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770771 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbrj\" (UniqueName: \"kubernetes.io/projected/71031cc6-4940-49c7-acba-e58212bcf5f4-kube-api-access-ssbrj\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770782 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770795 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770808 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:37 crc kubenswrapper[4921]: I0318 14:19:37.770820 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71031cc6-4940-49c7-acba-e58212bcf5f4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.041406 
4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" event={"ID":"71031cc6-4940-49c7-acba-e58212bcf5f4","Type":"ContainerDied","Data":"ee9ac69f19d3d2eef1431e80c37987a846506fb798ac9996b28feb9275d43318"} Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.041493 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9ac69f19d3d2eef1431e80c37987a846506fb798ac9996b28feb9275d43318" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.041442 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-82j2d" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.041525 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4sqv" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="registry-server" containerID="cri-o://73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411" gracePeriod=2 Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.139975 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-cq5hz"] Mar 18 14:19:38 crc kubenswrapper[4921]: E0318 14:19:38.140769 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71031cc6-4940-49c7-acba-e58212bcf5f4" containerName="install-certs-openstack-openstack-cell1" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.140786 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="71031cc6-4940-49c7-acba-e58212bcf5f4" containerName="install-certs-openstack-openstack-cell1" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.141006 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="71031cc6-4940-49c7-acba-e58212bcf5f4" containerName="install-certs-openstack-openstack-cell1" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.141821 4921 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.144006 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.144006 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.144327 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.146073 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.153195 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-cq5hz"] Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.209349 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:19:38 crc kubenswrapper[4921]: E0318 14:19:38.209618 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.281367 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" 
(UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.281454 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tvk\" (UniqueName: \"kubernetes.io/projected/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-kube-api-access-v7tvk\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.281730 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ceph\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.281805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-inventory\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.386402 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.386819 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v7tvk\" (UniqueName: \"kubernetes.io/projected/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-kube-api-access-v7tvk\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.387003 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ceph\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.387094 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-inventory\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.393433 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ceph\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.394026 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-inventory\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.397006 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.406723 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tvk\" (UniqueName: \"kubernetes.io/projected/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-kube-api-access-v7tvk\") pod \"ceph-client-openstack-openstack-cell1-cq5hz\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.521340 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.550931 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.693631 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-utilities\") pod \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.693741 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlrkx\" (UniqueName: \"kubernetes.io/projected/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-kube-api-access-jlrkx\") pod \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.693768 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-catalog-content\") pod \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\" (UID: \"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5\") " Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.694706 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-utilities" (OuterVolumeSpecName: "utilities") pod "a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" (UID: "a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.698661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-kube-api-access-jlrkx" (OuterVolumeSpecName: "kube-api-access-jlrkx") pod "a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" (UID: "a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5"). InnerVolumeSpecName "kube-api-access-jlrkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.796874 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.796921 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlrkx\" (UniqueName: \"kubernetes.io/projected/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-kube-api-access-jlrkx\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.849145 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" (UID: "a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:19:38 crc kubenswrapper[4921]: I0318 14:19:38.899192 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.051695 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerID="73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411" exitCode=0 Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.051876 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerDied","Data":"73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411"} Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.051988 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4sqv" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.052006 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4sqv" event={"ID":"a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5","Type":"ContainerDied","Data":"6b2abd3a6955ef3c6a72ccd00a63a94973b2c5dd2c4de2285edf819caca52566"} Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.052030 4921 scope.go:117] "RemoveContainer" containerID="73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.074233 4921 scope.go:117] "RemoveContainer" containerID="15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.104010 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4sqv"] Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.110863 4921 scope.go:117] "RemoveContainer" containerID="5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.114835 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4sqv"] Mar 18 14:19:39 crc kubenswrapper[4921]: W0318 14:19:39.119883 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb062ff1e_3378_47fb_bd4e_95e3ea289ca8.slice/crio-0636160ff02feedda966e3eda16f557b805a9b8eb7fcaaf877365f7d41e70785 WatchSource:0}: Error finding container 0636160ff02feedda966e3eda16f557b805a9b8eb7fcaaf877365f7d41e70785: Status 404 returned error can't find the container with id 0636160ff02feedda966e3eda16f557b805a9b8eb7fcaaf877365f7d41e70785 Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.125855 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-cq5hz"] 
Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.129120 4921 scope.go:117] "RemoveContainer" containerID="73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411" Mar 18 14:19:39 crc kubenswrapper[4921]: E0318 14:19:39.129518 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411\": container with ID starting with 73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411 not found: ID does not exist" containerID="73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.129557 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411"} err="failed to get container status \"73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411\": rpc error: code = NotFound desc = could not find container \"73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411\": container with ID starting with 73c9d1f33f191ea6761bfb83707aad7b641044dad2ea4ab61d1b443acd6ab411 not found: ID does not exist" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.129578 4921 scope.go:117] "RemoveContainer" containerID="15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745" Mar 18 14:19:39 crc kubenswrapper[4921]: E0318 14:19:39.129886 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745\": container with ID starting with 15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745 not found: ID does not exist" containerID="15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.129919 4921 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745"} err="failed to get container status \"15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745\": rpc error: code = NotFound desc = could not find container \"15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745\": container with ID starting with 15d5c1d025ced3bffaa28d1eb2024e396ab8f4425534f6d67629b970c81d3745 not found: ID does not exist" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.129945 4921 scope.go:117] "RemoveContainer" containerID="5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6" Mar 18 14:19:39 crc kubenswrapper[4921]: E0318 14:19:39.130197 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6\": container with ID starting with 5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6 not found: ID does not exist" containerID="5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.130217 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6"} err="failed to get container status \"5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6\": rpc error: code = NotFound desc = could not find container \"5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6\": container with ID starting with 5c67bac3745766c59a02178e094c7017975d1f8fcea032650c35bf2cd47918d6 not found: ID does not exist" Mar 18 14:19:39 crc kubenswrapper[4921]: I0318 14:19:39.220762 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" path="/var/lib/kubelet/pods/a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5/volumes" Mar 18 14:19:40 crc 
kubenswrapper[4921]: I0318 14:19:40.067769 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" event={"ID":"b062ff1e-3378-47fb-bd4e-95e3ea289ca8","Type":"ContainerStarted","Data":"3e0d74ed62eafddb0fffe1996b340f3c325863c37988d4598e513b3a7e68687e"} Mar 18 14:19:40 crc kubenswrapper[4921]: I0318 14:19:40.068120 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" event={"ID":"b062ff1e-3378-47fb-bd4e-95e3ea289ca8","Type":"ContainerStarted","Data":"0636160ff02feedda966e3eda16f557b805a9b8eb7fcaaf877365f7d41e70785"} Mar 18 14:19:40 crc kubenswrapper[4921]: I0318 14:19:40.094873 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" podStartSLOduration=1.9322013789999999 podStartE2EDuration="2.094850762s" podCreationTimestamp="2026-03-18 14:19:38 +0000 UTC" firstStartedPulling="2026-03-18 14:19:39.129530474 +0000 UTC m=+7798.679451103" lastFinishedPulling="2026-03-18 14:19:39.292179847 +0000 UTC m=+7798.842100486" observedRunningTime="2026-03-18 14:19:40.092556852 +0000 UTC m=+7799.642477531" watchObservedRunningTime="2026-03-18 14:19:40.094850762 +0000 UTC m=+7799.644771401" Mar 18 14:19:45 crc kubenswrapper[4921]: I0318 14:19:45.119152 4921 generic.go:334] "Generic (PLEG): container finished" podID="b062ff1e-3378-47fb-bd4e-95e3ea289ca8" containerID="3e0d74ed62eafddb0fffe1996b340f3c325863c37988d4598e513b3a7e68687e" exitCode=0 Mar 18 14:19:45 crc kubenswrapper[4921]: I0318 14:19:45.119225 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" event={"ID":"b062ff1e-3378-47fb-bd4e-95e3ea289ca8","Type":"ContainerDied","Data":"3e0d74ed62eafddb0fffe1996b340f3c325863c37988d4598e513b3a7e68687e"} Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.621265 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.767903 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ssh-key-openstack-cell1\") pod \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.768065 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ceph\") pod \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.768268 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7tvk\" (UniqueName: \"kubernetes.io/projected/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-kube-api-access-v7tvk\") pod \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.768372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-inventory\") pod \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\" (UID: \"b062ff1e-3378-47fb-bd4e-95e3ea289ca8\") " Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.773905 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ceph" (OuterVolumeSpecName: "ceph") pod "b062ff1e-3378-47fb-bd4e-95e3ea289ca8" (UID: "b062ff1e-3378-47fb-bd4e-95e3ea289ca8"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.778614 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-kube-api-access-v7tvk" (OuterVolumeSpecName: "kube-api-access-v7tvk") pod "b062ff1e-3378-47fb-bd4e-95e3ea289ca8" (UID: "b062ff1e-3378-47fb-bd4e-95e3ea289ca8"). InnerVolumeSpecName "kube-api-access-v7tvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.800924 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-inventory" (OuterVolumeSpecName: "inventory") pod "b062ff1e-3378-47fb-bd4e-95e3ea289ca8" (UID: "b062ff1e-3378-47fb-bd4e-95e3ea289ca8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.803894 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b062ff1e-3378-47fb-bd4e-95e3ea289ca8" (UID: "b062ff1e-3378-47fb-bd4e-95e3ea289ca8"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.872079 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.872140 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.872153 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:46 crc kubenswrapper[4921]: I0318 14:19:46.872169 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7tvk\" (UniqueName: \"kubernetes.io/projected/b062ff1e-3378-47fb-bd4e-95e3ea289ca8-kube-api-access-v7tvk\") on node \"crc\" DevicePath \"\"" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.142014 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" event={"ID":"b062ff1e-3378-47fb-bd4e-95e3ea289ca8","Type":"ContainerDied","Data":"0636160ff02feedda966e3eda16f557b805a9b8eb7fcaaf877365f7d41e70785"} Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.142367 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0636160ff02feedda966e3eda16f557b805a9b8eb7fcaaf877365f7d41e70785" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.142263 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-cq5hz" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.250256 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pxqmq"] Mar 18 14:19:47 crc kubenswrapper[4921]: E0318 14:19:47.250847 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b062ff1e-3378-47fb-bd4e-95e3ea289ca8" containerName="ceph-client-openstack-openstack-cell1" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.250874 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b062ff1e-3378-47fb-bd4e-95e3ea289ca8" containerName="ceph-client-openstack-openstack-cell1" Mar 18 14:19:47 crc kubenswrapper[4921]: E0318 14:19:47.250907 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="extract-utilities" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.250917 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="extract-utilities" Mar 18 14:19:47 crc kubenswrapper[4921]: E0318 14:19:47.250953 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="registry-server" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.250960 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="registry-server" Mar 18 14:19:47 crc kubenswrapper[4921]: E0318 14:19:47.250977 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="extract-content" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.250986 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="extract-content" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.251278 4921 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a9d4ca99-7e3e-46e3-9b24-60bcfd8c9cd5" containerName="registry-server" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.251317 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b062ff1e-3378-47fb-bd4e-95e3ea289ca8" containerName="ceph-client-openstack-openstack-cell1" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.252319 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.257374 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.257866 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.258053 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.258807 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.259334 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.261734 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pxqmq"] Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.382695 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 
crc kubenswrapper[4921]: I0318 14:19:47.382780 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5sbt\" (UniqueName: \"kubernetes.io/projected/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-kube-api-access-v5sbt\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.383091 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ceph\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.383557 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.383837 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.384003 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-inventory\") pod \"ovn-openstack-openstack-cell1-pxqmq\" 
(UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.485557 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.485723 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.485828 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-inventory\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.485866 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.485895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5sbt\" (UniqueName: 
\"kubernetes.io/projected/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-kube-api-access-v5sbt\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.485972 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ceph\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.486727 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.490635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.490730 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.491439 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ceph\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.491773 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-inventory\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.512834 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5sbt\" (UniqueName: \"kubernetes.io/projected/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-kube-api-access-v5sbt\") pod \"ovn-openstack-openstack-cell1-pxqmq\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:47 crc kubenswrapper[4921]: I0318 14:19:47.582336 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:19:48 crc kubenswrapper[4921]: I0318 14:19:48.151202 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pxqmq"] Mar 18 14:19:48 crc kubenswrapper[4921]: I0318 14:19:48.154006 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" event={"ID":"fd44c11c-7a69-4da3-ab19-6e4a4a3603df","Type":"ContainerStarted","Data":"fee9fdc14d659b35bce87c1f714779f4a684f359b40b9dd3d853cd6e96bf9589"} Mar 18 14:19:49 crc kubenswrapper[4921]: I0318 14:19:49.167162 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" event={"ID":"fd44c11c-7a69-4da3-ab19-6e4a4a3603df","Type":"ContainerStarted","Data":"61a6bf8b1e2043ad0ca62d2ccb6a7bc46ccaea8e1440125c00d1eb1c0b581363"} Mar 18 14:19:49 crc kubenswrapper[4921]: I0318 14:19:49.198746 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" podStartSLOduration=2.017424886 podStartE2EDuration="2.198727394s" podCreationTimestamp="2026-03-18 14:19:47 +0000 UTC" firstStartedPulling="2026-03-18 14:19:48.138115691 +0000 UTC m=+7807.688036330" lastFinishedPulling="2026-03-18 14:19:48.319418199 +0000 UTC m=+7807.869338838" observedRunningTime="2026-03-18 14:19:49.19067659 +0000 UTC m=+7808.740597229" watchObservedRunningTime="2026-03-18 14:19:49.198727394 +0000 UTC m=+7808.748648033" Mar 18 14:19:50 crc kubenswrapper[4921]: I0318 14:19:50.209353 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:19:50 crc kubenswrapper[4921]: E0318 14:19:50.210005 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.154001 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mnf5v"] Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.156353 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.163187 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.163393 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.163523 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.171701 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mnf5v"] Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.258870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f27sw\" (UniqueName: \"kubernetes.io/projected/39e4231b-0993-4baa-afe0-e478aedd32df-kube-api-access-f27sw\") pod \"auto-csr-approver-29564060-mnf5v\" (UID: \"39e4231b-0993-4baa-afe0-e478aedd32df\") " pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.363212 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f27sw\" (UniqueName: \"kubernetes.io/projected/39e4231b-0993-4baa-afe0-e478aedd32df-kube-api-access-f27sw\") 
pod \"auto-csr-approver-29564060-mnf5v\" (UID: \"39e4231b-0993-4baa-afe0-e478aedd32df\") " pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.386554 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f27sw\" (UniqueName: \"kubernetes.io/projected/39e4231b-0993-4baa-afe0-e478aedd32df-kube-api-access-f27sw\") pod \"auto-csr-approver-29564060-mnf5v\" (UID: \"39e4231b-0993-4baa-afe0-e478aedd32df\") " pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:00 crc kubenswrapper[4921]: I0318 14:20:00.492595 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:01 crc kubenswrapper[4921]: I0318 14:20:01.026556 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mnf5v"] Mar 18 14:20:01 crc kubenswrapper[4921]: I0318 14:20:01.282449 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" event={"ID":"39e4231b-0993-4baa-afe0-e478aedd32df","Type":"ContainerStarted","Data":"dc0c0fafcc24a7aecab0cf7bd4897a93eb189af0dc14d4e80f2fab401faf8c29"} Mar 18 14:20:02 crc kubenswrapper[4921]: I0318 14:20:02.209360 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:20:02 crc kubenswrapper[4921]: E0318 14:20:02.209933 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:20:03 crc kubenswrapper[4921]: I0318 14:20:03.301828 4921 
generic.go:334] "Generic (PLEG): container finished" podID="39e4231b-0993-4baa-afe0-e478aedd32df" containerID="8fe77252331b6406173dd0298f48ccdf542efb7f128642b499b12a2de1c9a1c2" exitCode=0 Mar 18 14:20:03 crc kubenswrapper[4921]: I0318 14:20:03.302068 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" event={"ID":"39e4231b-0993-4baa-afe0-e478aedd32df","Type":"ContainerDied","Data":"8fe77252331b6406173dd0298f48ccdf542efb7f128642b499b12a2de1c9a1c2"} Mar 18 14:20:04 crc kubenswrapper[4921]: I0318 14:20:04.727937 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:04 crc kubenswrapper[4921]: I0318 14:20:04.850910 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f27sw\" (UniqueName: \"kubernetes.io/projected/39e4231b-0993-4baa-afe0-e478aedd32df-kube-api-access-f27sw\") pod \"39e4231b-0993-4baa-afe0-e478aedd32df\" (UID: \"39e4231b-0993-4baa-afe0-e478aedd32df\") " Mar 18 14:20:04 crc kubenswrapper[4921]: I0318 14:20:04.867497 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e4231b-0993-4baa-afe0-e478aedd32df-kube-api-access-f27sw" (OuterVolumeSpecName: "kube-api-access-f27sw") pod "39e4231b-0993-4baa-afe0-e478aedd32df" (UID: "39e4231b-0993-4baa-afe0-e478aedd32df"). InnerVolumeSpecName "kube-api-access-f27sw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:04 crc kubenswrapper[4921]: I0318 14:20:04.954211 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f27sw\" (UniqueName: \"kubernetes.io/projected/39e4231b-0993-4baa-afe0-e478aedd32df-kube-api-access-f27sw\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:05 crc kubenswrapper[4921]: I0318 14:20:05.325310 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" event={"ID":"39e4231b-0993-4baa-afe0-e478aedd32df","Type":"ContainerDied","Data":"dc0c0fafcc24a7aecab0cf7bd4897a93eb189af0dc14d4e80f2fab401faf8c29"} Mar 18 14:20:05 crc kubenswrapper[4921]: I0318 14:20:05.325361 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0c0fafcc24a7aecab0cf7bd4897a93eb189af0dc14d4e80f2fab401faf8c29" Mar 18 14:20:05 crc kubenswrapper[4921]: I0318 14:20:05.325382 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564060-mnf5v" Mar 18 14:20:05 crc kubenswrapper[4921]: I0318 14:20:05.807694 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-x4jlk"] Mar 18 14:20:05 crc kubenswrapper[4921]: I0318 14:20:05.821281 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564054-x4jlk"] Mar 18 14:20:07 crc kubenswrapper[4921]: I0318 14:20:07.222964 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9561c92b-ec12-486a-9135-7bd652668deb" path="/var/lib/kubelet/pods/9561c92b-ec12-486a-9135-7bd652668deb/volumes" Mar 18 14:20:13 crc kubenswrapper[4921]: I0318 14:20:13.209446 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:20:13 crc kubenswrapper[4921]: E0318 14:20:13.210103 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:20:16 crc kubenswrapper[4921]: I0318 14:20:16.682957 4921 scope.go:117] "RemoveContainer" containerID="1a0ee61789ba7c3b295ca5cdc360c689f6ee1ce1941c0bc867434da5c84467bd" Mar 18 14:20:25 crc kubenswrapper[4921]: I0318 14:20:25.211494 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:20:25 crc kubenswrapper[4921]: E0318 14:20:25.213037 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:20:37 crc kubenswrapper[4921]: I0318 14:20:37.209028 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:20:37 crc kubenswrapper[4921]: E0318 14:20:37.209811 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:20:52 crc kubenswrapper[4921]: I0318 14:20:52.209899 4921 scope.go:117] "RemoveContainer" 
containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:20:52 crc kubenswrapper[4921]: E0318 14:20:52.211199 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:20:52 crc kubenswrapper[4921]: I0318 14:20:52.810673 4921 generic.go:334] "Generic (PLEG): container finished" podID="fd44c11c-7a69-4da3-ab19-6e4a4a3603df" containerID="61a6bf8b1e2043ad0ca62d2ccb6a7bc46ccaea8e1440125c00d1eb1c0b581363" exitCode=0 Mar 18 14:20:52 crc kubenswrapper[4921]: I0318 14:20:52.810729 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" event={"ID":"fd44c11c-7a69-4da3-ab19-6e4a4a3603df","Type":"ContainerDied","Data":"61a6bf8b1e2043ad0ca62d2ccb6a7bc46ccaea8e1440125c00d1eb1c0b581363"} Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.267295 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.390552 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-inventory\") pod \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.390702 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ssh-key-openstack-cell1\") pod \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.390850 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5sbt\" (UniqueName: \"kubernetes.io/projected/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-kube-api-access-v5sbt\") pod \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.390895 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovn-combined-ca-bundle\") pod \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.390934 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ceph\") pod \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.390973 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovncontroller-config-0\") pod \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\" (UID: \"fd44c11c-7a69-4da3-ab19-6e4a4a3603df\") " Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.397366 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fd44c11c-7a69-4da3-ab19-6e4a4a3603df" (UID: "fd44c11c-7a69-4da3-ab19-6e4a4a3603df"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.398587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ceph" (OuterVolumeSpecName: "ceph") pod "fd44c11c-7a69-4da3-ab19-6e4a4a3603df" (UID: "fd44c11c-7a69-4da3-ab19-6e4a4a3603df"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.400072 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-kube-api-access-v5sbt" (OuterVolumeSpecName: "kube-api-access-v5sbt") pod "fd44c11c-7a69-4da3-ab19-6e4a4a3603df" (UID: "fd44c11c-7a69-4da3-ab19-6e4a4a3603df"). InnerVolumeSpecName "kube-api-access-v5sbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.418832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "fd44c11c-7a69-4da3-ab19-6e4a4a3603df" (UID: "fd44c11c-7a69-4da3-ab19-6e4a4a3603df"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.424474 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-inventory" (OuterVolumeSpecName: "inventory") pod "fd44c11c-7a69-4da3-ab19-6e4a4a3603df" (UID: "fd44c11c-7a69-4da3-ab19-6e4a4a3603df"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.426186 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fd44c11c-7a69-4da3-ab19-6e4a4a3603df" (UID: "fd44c11c-7a69-4da3-ab19-6e4a4a3603df"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.493456 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.493491 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.493503 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5sbt\" (UniqueName: \"kubernetes.io/projected/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-kube-api-access-v5sbt\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.493523 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.493531 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.493539 4921 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fd44c11c-7a69-4da3-ab19-6e4a4a3603df-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.842914 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" event={"ID":"fd44c11c-7a69-4da3-ab19-6e4a4a3603df","Type":"ContainerDied","Data":"fee9fdc14d659b35bce87c1f714779f4a684f359b40b9dd3d853cd6e96bf9589"} Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.843298 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee9fdc14d659b35bce87c1f714779f4a684f359b40b9dd3d853cd6e96bf9589" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.842959 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pxqmq" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.924582 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-nf6lh"] Mar 18 14:20:54 crc kubenswrapper[4921]: E0318 14:20:54.925045 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd44c11c-7a69-4da3-ab19-6e4a4a3603df" containerName="ovn-openstack-openstack-cell1" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.925066 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd44c11c-7a69-4da3-ab19-6e4a4a3603df" containerName="ovn-openstack-openstack-cell1" Mar 18 14:20:54 crc kubenswrapper[4921]: E0318 14:20:54.925077 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e4231b-0993-4baa-afe0-e478aedd32df" containerName="oc" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.925082 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e4231b-0993-4baa-afe0-e478aedd32df" containerName="oc" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.925311 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd44c11c-7a69-4da3-ab19-6e4a4a3603df" containerName="ovn-openstack-openstack-cell1" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.925327 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e4231b-0993-4baa-afe0-e478aedd32df" containerName="oc" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.926377 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.931775 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.932045 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.932145 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.932190 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.932319 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.940150 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 14:20:54 crc kubenswrapper[4921]: I0318 14:20:54.966830 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-nf6lh"] Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.003836 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.003926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.003951 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.003985 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzj8n\" (UniqueName: \"kubernetes.io/projected/07566af1-b678-4f13-ae4e-4a1c78a219fd-kube-api-access-hzj8n\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.004004 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.004083 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ceph\") pod 
\"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.004132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.105964 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.106016 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.106075 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzj8n\" (UniqueName: \"kubernetes.io/projected/07566af1-b678-4f13-ae4e-4a1c78a219fd-kube-api-access-hzj8n\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 
14:20:55.106104 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.106213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.106267 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.106324 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.111696 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ceph\") pod 
\"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.112200 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.112292 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.113090 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.113627 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 
14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.118770 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.126902 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzj8n\" (UniqueName: \"kubernetes.io/projected/07566af1-b678-4f13-ae4e-4a1c78a219fd-kube-api-access-hzj8n\") pod \"neutron-metadata-openstack-openstack-cell1-nf6lh\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.255600 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:20:55 crc kubenswrapper[4921]: I0318 14:20:55.871544 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-nf6lh"] Mar 18 14:20:56 crc kubenswrapper[4921]: I0318 14:20:56.871905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" event={"ID":"07566af1-b678-4f13-ae4e-4a1c78a219fd","Type":"ContainerStarted","Data":"169fc6ac9261a6ca9ebbb77e13e60a61e4376b54761b23eb4e9a9c59c8b0d4e1"} Mar 18 14:20:56 crc kubenswrapper[4921]: I0318 14:20:56.872461 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" event={"ID":"07566af1-b678-4f13-ae4e-4a1c78a219fd","Type":"ContainerStarted","Data":"f9d8a70e6a6fbeccf692a7753c476d1d26ded37eee95d02954c11dcd700cab90"} Mar 18 14:20:56 crc kubenswrapper[4921]: I0318 14:20:56.895077 4921 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" podStartSLOduration=2.69290457 podStartE2EDuration="2.895049999s" podCreationTimestamp="2026-03-18 14:20:54 +0000 UTC" firstStartedPulling="2026-03-18 14:20:55.871074725 +0000 UTC m=+7875.420995364" lastFinishedPulling="2026-03-18 14:20:56.073220154 +0000 UTC m=+7875.623140793" observedRunningTime="2026-03-18 14:20:56.892833472 +0000 UTC m=+7876.442754111" watchObservedRunningTime="2026-03-18 14:20:56.895049999 +0000 UTC m=+7876.444970678" Mar 18 14:21:07 crc kubenswrapper[4921]: I0318 14:21:07.209556 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:21:07 crc kubenswrapper[4921]: E0318 14:21:07.210399 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:21:21 crc kubenswrapper[4921]: I0318 14:21:21.216413 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:21:21 crc kubenswrapper[4921]: E0318 14:21:21.217204 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:21:35 crc kubenswrapper[4921]: I0318 14:21:35.209088 4921 scope.go:117] "RemoveContainer" 
containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:21:35 crc kubenswrapper[4921]: E0318 14:21:35.209934 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:21:48 crc kubenswrapper[4921]: I0318 14:21:48.352036 4921 generic.go:334] "Generic (PLEG): container finished" podID="07566af1-b678-4f13-ae4e-4a1c78a219fd" containerID="169fc6ac9261a6ca9ebbb77e13e60a61e4376b54761b23eb4e9a9c59c8b0d4e1" exitCode=0 Mar 18 14:21:48 crc kubenswrapper[4921]: I0318 14:21:48.352168 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" event={"ID":"07566af1-b678-4f13-ae4e-4a1c78a219fd","Type":"ContainerDied","Data":"169fc6ac9261a6ca9ebbb77e13e60a61e4376b54761b23eb4e9a9c59c8b0d4e1"} Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.210713 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:21:49 crc kubenswrapper[4921]: E0318 14:21:49.211198 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.815168 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.936951 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.937455 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-nova-metadata-neutron-config-0\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.937546 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-metadata-combined-ca-bundle\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.937630 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ssh-key-openstack-cell1\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.937672 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzj8n\" (UniqueName: \"kubernetes.io/projected/07566af1-b678-4f13-ae4e-4a1c78a219fd-kube-api-access-hzj8n\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: 
\"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.937721 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ceph\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.937773 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-inventory\") pod \"07566af1-b678-4f13-ae4e-4a1c78a219fd\" (UID: \"07566af1-b678-4f13-ae4e-4a1c78a219fd\") " Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.944432 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.944470 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07566af1-b678-4f13-ae4e-4a1c78a219fd-kube-api-access-hzj8n" (OuterVolumeSpecName: "kube-api-access-hzj8n") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "kube-api-access-hzj8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.945643 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ceph" (OuterVolumeSpecName: "ceph") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.975669 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.976199 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.980519 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:49 crc kubenswrapper[4921]: I0318 14:21:49.982853 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-inventory" (OuterVolumeSpecName: "inventory") pod "07566af1-b678-4f13-ae4e-4a1c78a219fd" (UID: "07566af1-b678-4f13-ae4e-4a1c78a219fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.039855 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.039906 4921 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.039920 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.039931 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.039941 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzj8n\" (UniqueName: \"kubernetes.io/projected/07566af1-b678-4f13-ae4e-4a1c78a219fd-kube-api-access-hzj8n\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc 
kubenswrapper[4921]: I0318 14:21:50.039952 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.039962 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07566af1-b678-4f13-ae4e-4a1c78a219fd-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.374522 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" event={"ID":"07566af1-b678-4f13-ae4e-4a1c78a219fd","Type":"ContainerDied","Data":"f9d8a70e6a6fbeccf692a7753c476d1d26ded37eee95d02954c11dcd700cab90"} Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.374562 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d8a70e6a6fbeccf692a7753c476d1d26ded37eee95d02954c11dcd700cab90" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.374566 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-nf6lh" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.471540 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rlkd2"] Mar 18 14:21:50 crc kubenswrapper[4921]: E0318 14:21:50.472155 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07566af1-b678-4f13-ae4e-4a1c78a219fd" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.472176 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="07566af1-b678-4f13-ae4e-4a1c78a219fd" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.472445 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="07566af1-b678-4f13-ae4e-4a1c78a219fd" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.473418 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.476553 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.476993 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.478020 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.489754 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:21:50 crc kubenswrapper[4921]: W0318 14:21:50.490085 4921 reflector.go:561] object-"openstack"/"libvirt-secret": failed to list *v1.Secret: secrets "libvirt-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Mar 18 14:21:50 crc kubenswrapper[4921]: E0318 14:21:50.491644 4921 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"libvirt-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"libvirt-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.508560 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rlkd2"] Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.551065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.551186 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-inventory\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.551225 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.551336 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4fx\" (UniqueName: \"kubernetes.io/projected/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-kube-api-access-km4fx\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.551453 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ceph\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: 
I0318 14:21:50.551539 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.653744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ceph\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.653840 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.653883 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.653943 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-inventory\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " 
pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.653973 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.654015 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4fx\" (UniqueName: \"kubernetes.io/projected/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-kube-api-access-km4fx\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.658638 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.659049 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-inventory\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.660205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1\") pod 
\"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.682130 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ceph\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:50 crc kubenswrapper[4921]: I0318 14:21:50.682811 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4fx\" (UniqueName: \"kubernetes.io/projected/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-kube-api-access-km4fx\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:51 crc kubenswrapper[4921]: I0318 14:21:51.634312 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 14:21:51 crc kubenswrapper[4921]: I0318 14:21:51.649153 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rlkd2\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:51 crc kubenswrapper[4921]: I0318 14:21:51.709282 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:21:52 crc kubenswrapper[4921]: I0318 14:21:52.238976 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rlkd2"] Mar 18 14:21:52 crc kubenswrapper[4921]: I0318 14:21:52.399607 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" event={"ID":"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e","Type":"ContainerStarted","Data":"ba062a9082dbdc83aa26885e54878052b829f0febbb869faf467847b5e5f002c"} Mar 18 14:21:53 crc kubenswrapper[4921]: I0318 14:21:53.416526 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" event={"ID":"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e","Type":"ContainerStarted","Data":"f78d9aac2b4ad541cb2cf121ee90ffe0738fc7bd9eea00b75945c2a62448181d"} Mar 18 14:21:53 crc kubenswrapper[4921]: I0318 14:21:53.453818 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" podStartSLOduration=3.2886441570000002 podStartE2EDuration="3.453787821s" podCreationTimestamp="2026-03-18 14:21:50 +0000 UTC" firstStartedPulling="2026-03-18 14:21:52.243320413 +0000 UTC m=+7931.793241052" lastFinishedPulling="2026-03-18 14:21:52.408464067 +0000 UTC m=+7931.958384716" observedRunningTime="2026-03-18 14:21:53.439696017 +0000 UTC m=+7932.989616656" watchObservedRunningTime="2026-03-18 14:21:53.453787821 +0000 UTC m=+7933.003708470" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.149334 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564062-f7pbb"] Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.151969 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.154384 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.154655 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.156744 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.165410 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-f7pbb"] Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.273671 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcr9\" (UniqueName: \"kubernetes.io/projected/59826938-a646-4eed-a604-f6a116f33f95-kube-api-access-2lcr9\") pod \"auto-csr-approver-29564062-f7pbb\" (UID: \"59826938-a646-4eed-a604-f6a116f33f95\") " pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.376447 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcr9\" (UniqueName: \"kubernetes.io/projected/59826938-a646-4eed-a604-f6a116f33f95-kube-api-access-2lcr9\") pod \"auto-csr-approver-29564062-f7pbb\" (UID: \"59826938-a646-4eed-a604-f6a116f33f95\") " pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.395160 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcr9\" (UniqueName: \"kubernetes.io/projected/59826938-a646-4eed-a604-f6a116f33f95-kube-api-access-2lcr9\") pod \"auto-csr-approver-29564062-f7pbb\" (UID: \"59826938-a646-4eed-a604-f6a116f33f95\") " 
pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.474066 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:00 crc kubenswrapper[4921]: I0318 14:22:00.985088 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-f7pbb"] Mar 18 14:22:01 crc kubenswrapper[4921]: I0318 14:22:01.217043 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:22:01 crc kubenswrapper[4921]: E0318 14:22:01.217523 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:22:01 crc kubenswrapper[4921]: I0318 14:22:01.496653 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" event={"ID":"59826938-a646-4eed-a604-f6a116f33f95","Type":"ContainerStarted","Data":"b06559dbf051efc086fbb47fb9fccb9c697d0ed11489fd466a89c5890a0b2cd3"} Mar 18 14:22:03 crc kubenswrapper[4921]: I0318 14:22:03.524087 4921 generic.go:334] "Generic (PLEG): container finished" podID="59826938-a646-4eed-a604-f6a116f33f95" containerID="148796a12551b30a55bb694a886214d6b2804952003bf5dde13bea4afc0f7cfe" exitCode=0 Mar 18 14:22:03 crc kubenswrapper[4921]: I0318 14:22:03.524171 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" event={"ID":"59826938-a646-4eed-a604-f6a116f33f95","Type":"ContainerDied","Data":"148796a12551b30a55bb694a886214d6b2804952003bf5dde13bea4afc0f7cfe"} 
Mar 18 14:22:06 crc kubenswrapper[4921]: I0318 14:22:06.246474 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:06 crc kubenswrapper[4921]: I0318 14:22:06.396016 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lcr9\" (UniqueName: \"kubernetes.io/projected/59826938-a646-4eed-a604-f6a116f33f95-kube-api-access-2lcr9\") pod \"59826938-a646-4eed-a604-f6a116f33f95\" (UID: \"59826938-a646-4eed-a604-f6a116f33f95\") " Mar 18 14:22:06 crc kubenswrapper[4921]: I0318 14:22:06.410041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59826938-a646-4eed-a604-f6a116f33f95-kube-api-access-2lcr9" (OuterVolumeSpecName: "kube-api-access-2lcr9") pod "59826938-a646-4eed-a604-f6a116f33f95" (UID: "59826938-a646-4eed-a604-f6a116f33f95"). InnerVolumeSpecName "kube-api-access-2lcr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:22:06 crc kubenswrapper[4921]: I0318 14:22:06.498558 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lcr9\" (UniqueName: \"kubernetes.io/projected/59826938-a646-4eed-a604-f6a116f33f95-kube-api-access-2lcr9\") on node \"crc\" DevicePath \"\"" Mar 18 14:22:07 crc kubenswrapper[4921]: I0318 14:22:07.155640 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" event={"ID":"59826938-a646-4eed-a604-f6a116f33f95","Type":"ContainerDied","Data":"b06559dbf051efc086fbb47fb9fccb9c697d0ed11489fd466a89c5890a0b2cd3"} Mar 18 14:22:07 crc kubenswrapper[4921]: I0318 14:22:07.155681 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06559dbf051efc086fbb47fb9fccb9c697d0ed11489fd466a89c5890a0b2cd3" Mar 18 14:22:07 crc kubenswrapper[4921]: I0318 14:22:07.155740 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564062-f7pbb" Mar 18 14:22:07 crc kubenswrapper[4921]: I0318 14:22:07.320328 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4pvlq"] Mar 18 14:22:07 crc kubenswrapper[4921]: I0318 14:22:07.330006 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564056-4pvlq"] Mar 18 14:22:09 crc kubenswrapper[4921]: I0318 14:22:09.224739 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffdc395d-7849-4de7-ae87-38e625fc5774" path="/var/lib/kubelet/pods/ffdc395d-7849-4de7-ae87-38e625fc5774/volumes" Mar 18 14:22:12 crc kubenswrapper[4921]: I0318 14:22:12.210351 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:22:12 crc kubenswrapper[4921]: E0318 14:22:12.211232 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:22:16 crc kubenswrapper[4921]: I0318 14:22:16.802137 4921 scope.go:117] "RemoveContainer" containerID="6682d561b6ce4ce7ef70d751e08f812ac3ca9878532fce956be3230c17751412" Mar 18 14:22:25 crc kubenswrapper[4921]: I0318 14:22:25.209101 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:22:25 crc kubenswrapper[4921]: E0318 14:22:25.210099 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:22:39 crc kubenswrapper[4921]: I0318 14:22:39.210395 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:22:39 crc kubenswrapper[4921]: E0318 14:22:39.211088 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:22:54 crc kubenswrapper[4921]: I0318 14:22:54.209813 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:22:54 crc kubenswrapper[4921]: E0318 14:22:54.210836 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:23:05 crc kubenswrapper[4921]: I0318 14:23:05.209007 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:23:05 crc kubenswrapper[4921]: E0318 14:23:05.209905 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:23:16 crc kubenswrapper[4921]: I0318 14:23:16.209542 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:23:16 crc kubenswrapper[4921]: E0318 14:23:16.210331 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:23:29 crc kubenswrapper[4921]: I0318 14:23:29.210087 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:23:29 crc kubenswrapper[4921]: E0318 14:23:29.211019 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:23:44 crc kubenswrapper[4921]: I0318 14:23:44.210741 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:23:44 crc kubenswrapper[4921]: E0318 14:23:44.213357 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:23:58 crc kubenswrapper[4921]: I0318 14:23:58.210261 4921 scope.go:117] "RemoveContainer" containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:23:59 crc kubenswrapper[4921]: I0318 14:23:59.340872 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"6f0ce8bfe0b110a175403a548ad96d572882eed877c07f03ec511ad314b9ca8f"} Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.156543 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564064-txjf7"] Mar 18 14:24:00 crc kubenswrapper[4921]: E0318 14:24:00.157196 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59826938-a646-4eed-a604-f6a116f33f95" containerName="oc" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.157208 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="59826938-a646-4eed-a604-f6a116f33f95" containerName="oc" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.157421 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="59826938-a646-4eed-a604-f6a116f33f95" containerName="oc" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.158282 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.160516 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.160942 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.165863 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.179605 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-txjf7"] Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.294234 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xpfv\" (UniqueName: \"kubernetes.io/projected/b53935d6-fe34-43a1-a940-537ecd9e161a-kube-api-access-8xpfv\") pod \"auto-csr-approver-29564064-txjf7\" (UID: \"b53935d6-fe34-43a1-a940-537ecd9e161a\") " pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.397354 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xpfv\" (UniqueName: \"kubernetes.io/projected/b53935d6-fe34-43a1-a940-537ecd9e161a-kube-api-access-8xpfv\") pod \"auto-csr-approver-29564064-txjf7\" (UID: \"b53935d6-fe34-43a1-a940-537ecd9e161a\") " pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.418329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xpfv\" (UniqueName: \"kubernetes.io/projected/b53935d6-fe34-43a1-a940-537ecd9e161a-kube-api-access-8xpfv\") pod \"auto-csr-approver-29564064-txjf7\" (UID: \"b53935d6-fe34-43a1-a940-537ecd9e161a\") " 
pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:00 crc kubenswrapper[4921]: I0318 14:24:00.481272 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:01 crc kubenswrapper[4921]: I0318 14:24:01.001133 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-txjf7"] Mar 18 14:24:01 crc kubenswrapper[4921]: W0318 14:24:01.008654 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb53935d6_fe34_43a1_a940_537ecd9e161a.slice/crio-b70d58feedc599576065137f2ca179864c6581f8d863ab9a66d169cc11bc0221 WatchSource:0}: Error finding container b70d58feedc599576065137f2ca179864c6581f8d863ab9a66d169cc11bc0221: Status 404 returned error can't find the container with id b70d58feedc599576065137f2ca179864c6581f8d863ab9a66d169cc11bc0221 Mar 18 14:24:01 crc kubenswrapper[4921]: I0318 14:24:01.011761 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:24:01 crc kubenswrapper[4921]: I0318 14:24:01.366424 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-txjf7" event={"ID":"b53935d6-fe34-43a1-a940-537ecd9e161a","Type":"ContainerStarted","Data":"b70d58feedc599576065137f2ca179864c6581f8d863ab9a66d169cc11bc0221"} Mar 18 14:24:02 crc kubenswrapper[4921]: I0318 14:24:02.444719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-txjf7" event={"ID":"b53935d6-fe34-43a1-a940-537ecd9e161a","Type":"ContainerStarted","Data":"1a57329cda21ed5521ebe5cebe023dacf1c068944a541bd5d41fdf9ea9432c82"} Mar 18 14:24:02 crc kubenswrapper[4921]: I0318 14:24:02.488128 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564064-txjf7" 
podStartSLOduration=1.473123938 podStartE2EDuration="2.488098332s" podCreationTimestamp="2026-03-18 14:24:00 +0000 UTC" firstStartedPulling="2026-03-18 14:24:01.011556787 +0000 UTC m=+8060.561477426" lastFinishedPulling="2026-03-18 14:24:02.026531181 +0000 UTC m=+8061.576451820" observedRunningTime="2026-03-18 14:24:02.476824578 +0000 UTC m=+8062.026745217" watchObservedRunningTime="2026-03-18 14:24:02.488098332 +0000 UTC m=+8062.038018971" Mar 18 14:24:03 crc kubenswrapper[4921]: I0318 14:24:03.455104 4921 generic.go:334] "Generic (PLEG): container finished" podID="b53935d6-fe34-43a1-a940-537ecd9e161a" containerID="1a57329cda21ed5521ebe5cebe023dacf1c068944a541bd5d41fdf9ea9432c82" exitCode=0 Mar 18 14:24:03 crc kubenswrapper[4921]: I0318 14:24:03.455238 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-txjf7" event={"ID":"b53935d6-fe34-43a1-a940-537ecd9e161a","Type":"ContainerDied","Data":"1a57329cda21ed5521ebe5cebe023dacf1c068944a541bd5d41fdf9ea9432c82"} Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.290045 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.401533 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xpfv\" (UniqueName: \"kubernetes.io/projected/b53935d6-fe34-43a1-a940-537ecd9e161a-kube-api-access-8xpfv\") pod \"b53935d6-fe34-43a1-a940-537ecd9e161a\" (UID: \"b53935d6-fe34-43a1-a940-537ecd9e161a\") " Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.410844 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53935d6-fe34-43a1-a940-537ecd9e161a-kube-api-access-8xpfv" (OuterVolumeSpecName: "kube-api-access-8xpfv") pod "b53935d6-fe34-43a1-a940-537ecd9e161a" (UID: "b53935d6-fe34-43a1-a940-537ecd9e161a"). InnerVolumeSpecName "kube-api-access-8xpfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.476195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564064-txjf7" event={"ID":"b53935d6-fe34-43a1-a940-537ecd9e161a","Type":"ContainerDied","Data":"b70d58feedc599576065137f2ca179864c6581f8d863ab9a66d169cc11bc0221"} Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.476235 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b70d58feedc599576065137f2ca179864c6581f8d863ab9a66d169cc11bc0221" Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.476283 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564064-txjf7" Mar 18 14:24:05 crc kubenswrapper[4921]: I0318 14:24:05.504843 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xpfv\" (UniqueName: \"kubernetes.io/projected/b53935d6-fe34-43a1-a940-537ecd9e161a-kube-api-access-8xpfv\") on node \"crc\" DevicePath \"\"" Mar 18 14:24:06 crc kubenswrapper[4921]: I0318 14:24:06.371103 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-ch5hf"] Mar 18 14:24:06 crc kubenswrapper[4921]: I0318 14:24:06.396807 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564058-ch5hf"] Mar 18 14:24:07 crc kubenswrapper[4921]: I0318 14:24:07.223891 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b552589-8f3a-4e6f-84f0-7a6067832f18" path="/var/lib/kubelet/pods/4b552589-8f3a-4e6f-84f0-7a6067832f18/volumes" Mar 18 14:24:16 crc kubenswrapper[4921]: I0318 14:24:16.905855 4921 scope.go:117] "RemoveContainer" containerID="73985e4ba40604e9c0f7b4bef7453aec80d645bd8bd8159e8af6801c87d2486a" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.168034 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29564066-kn9nx"] Mar 18 14:26:00 crc kubenswrapper[4921]: E0318 14:26:00.168998 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53935d6-fe34-43a1-a940-537ecd9e161a" containerName="oc" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.169012 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53935d6-fe34-43a1-a940-537ecd9e161a" containerName="oc" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.169276 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53935d6-fe34-43a1-a940-537ecd9e161a" containerName="oc" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.170020 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.176815 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.176830 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.176927 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.180737 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-kn9nx"] Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.331090 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff4xj\" (UniqueName: \"kubernetes.io/projected/84c3a186-dfc4-4c33-9633-5d120a21f61d-kube-api-access-ff4xj\") pod \"auto-csr-approver-29564066-kn9nx\" (UID: \"84c3a186-dfc4-4c33-9633-5d120a21f61d\") " pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 
14:26:00.433764 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff4xj\" (UniqueName: \"kubernetes.io/projected/84c3a186-dfc4-4c33-9633-5d120a21f61d-kube-api-access-ff4xj\") pod \"auto-csr-approver-29564066-kn9nx\" (UID: \"84c3a186-dfc4-4c33-9633-5d120a21f61d\") " pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.455568 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff4xj\" (UniqueName: \"kubernetes.io/projected/84c3a186-dfc4-4c33-9633-5d120a21f61d-kube-api-access-ff4xj\") pod \"auto-csr-approver-29564066-kn9nx\" (UID: \"84c3a186-dfc4-4c33-9633-5d120a21f61d\") " pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:00 crc kubenswrapper[4921]: I0318 14:26:00.501477 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:01 crc kubenswrapper[4921]: I0318 14:26:01.698937 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-kn9nx"] Mar 18 14:26:01 crc kubenswrapper[4921]: I0318 14:26:01.727902 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" event={"ID":"84c3a186-dfc4-4c33-9633-5d120a21f61d","Type":"ContainerStarted","Data":"56aab0183c5dfc7d77388edeaef8f58704f0c1e2d1dfcf830840c799c7c3b65b"} Mar 18 14:26:03 crc kubenswrapper[4921]: I0318 14:26:03.745620 4921 generic.go:334] "Generic (PLEG): container finished" podID="84c3a186-dfc4-4c33-9633-5d120a21f61d" containerID="f54c56ed8a73e1967b29f0b523ff696af5479e56ee3afa510e4fe2e747254faa" exitCode=0 Mar 18 14:26:03 crc kubenswrapper[4921]: I0318 14:26:03.745688 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" 
event={"ID":"84c3a186-dfc4-4c33-9633-5d120a21f61d","Type":"ContainerDied","Data":"f54c56ed8a73e1967b29f0b523ff696af5479e56ee3afa510e4fe2e747254faa"} Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.214633 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.342330 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff4xj\" (UniqueName: \"kubernetes.io/projected/84c3a186-dfc4-4c33-9633-5d120a21f61d-kube-api-access-ff4xj\") pod \"84c3a186-dfc4-4c33-9633-5d120a21f61d\" (UID: \"84c3a186-dfc4-4c33-9633-5d120a21f61d\") " Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.348325 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c3a186-dfc4-4c33-9633-5d120a21f61d-kube-api-access-ff4xj" (OuterVolumeSpecName: "kube-api-access-ff4xj") pod "84c3a186-dfc4-4c33-9633-5d120a21f61d" (UID: "84c3a186-dfc4-4c33-9633-5d120a21f61d"). InnerVolumeSpecName "kube-api-access-ff4xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.445761 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff4xj\" (UniqueName: \"kubernetes.io/projected/84c3a186-dfc4-4c33-9633-5d120a21f61d-kube-api-access-ff4xj\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.768519 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" event={"ID":"84c3a186-dfc4-4c33-9633-5d120a21f61d","Type":"ContainerDied","Data":"56aab0183c5dfc7d77388edeaef8f58704f0c1e2d1dfcf830840c799c7c3b65b"} Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.768563 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56aab0183c5dfc7d77388edeaef8f58704f0c1e2d1dfcf830840c799c7c3b65b" Mar 18 14:26:05 crc kubenswrapper[4921]: I0318 14:26:05.768579 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564066-kn9nx" Mar 18 14:26:06 crc kubenswrapper[4921]: I0318 14:26:06.290530 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mnf5v"] Mar 18 14:26:06 crc kubenswrapper[4921]: I0318 14:26:06.299921 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564060-mnf5v"] Mar 18 14:26:07 crc kubenswrapper[4921]: I0318 14:26:07.224421 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e4231b-0993-4baa-afe0-e478aedd32df" path="/var/lib/kubelet/pods/39e4231b-0993-4baa-afe0-e478aedd32df/volumes" Mar 18 14:26:09 crc kubenswrapper[4921]: I0318 14:26:09.813234 4921 generic.go:334] "Generic (PLEG): container finished" podID="3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" containerID="f78d9aac2b4ad541cb2cf121ee90ffe0738fc7bd9eea00b75945c2a62448181d" exitCode=2 Mar 18 14:26:09 crc kubenswrapper[4921]: I0318 14:26:09.813325 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" event={"ID":"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e","Type":"ContainerDied","Data":"f78d9aac2b4ad541cb2cf121ee90ffe0738fc7bd9eea00b75945c2a62448181d"} Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.249309 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.372081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.372294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-combined-ca-bundle\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.372322 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-secret-0\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.373274 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-inventory\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.373324 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ceph\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.373461 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km4fx\" (UniqueName: \"kubernetes.io/projected/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-kube-api-access-km4fx\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.382666 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.383015 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ceph" (OuterVolumeSpecName: "ceph") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.383178 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-kube-api-access-km4fx" (OuterVolumeSpecName: "kube-api-access-km4fx") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e"). InnerVolumeSpecName "kube-api-access-km4fx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.404374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:11 crc kubenswrapper[4921]: E0318 14:26:11.417860 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1 podName:3c9aa9e7-5524-4d45-8b76-55b2e0beb89e nodeName:}" failed. No retries permitted until 2026-03-18 14:26:11.91727159 +0000 UTC m=+8191.467192269 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-cell1" (UniqueName: "kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e") : error deleting /var/lib/kubelet/pods/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e/volume-subpaths: remove /var/lib/kubelet/pods/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e/volume-subpaths: no such file or directory Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.420270 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-inventory" (OuterVolumeSpecName: "inventory") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.476379 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km4fx\" (UniqueName: \"kubernetes.io/projected/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-kube-api-access-km4fx\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.476423 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.476437 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.476447 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.476461 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.842859 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" event={"ID":"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e","Type":"ContainerDied","Data":"ba062a9082dbdc83aa26885e54878052b829f0febbb869faf467847b5e5f002c"} Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.842897 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba062a9082dbdc83aa26885e54878052b829f0febbb869faf467847b5e5f002c" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.842997 
4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rlkd2" Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.987311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1\") pod \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\" (UID: \"3c9aa9e7-5524-4d45-8b76-55b2e0beb89e\") " Mar 18 14:26:11 crc kubenswrapper[4921]: I0318 14:26:11.992367 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" (UID: "3c9aa9e7-5524-4d45-8b76-55b2e0beb89e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:26:12 crc kubenswrapper[4921]: I0318 14:26:12.091034 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3c9aa9e7-5524-4d45-8b76-55b2e0beb89e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:26:17 crc kubenswrapper[4921]: I0318 14:26:17.020613 4921 scope.go:117] "RemoveContainer" containerID="8fe77252331b6406173dd0298f48ccdf542efb7f128642b499b12a2de1c9a1c2" Mar 18 14:26:17 crc kubenswrapper[4921]: I0318 14:26:17.080970 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:26:17 crc kubenswrapper[4921]: I0318 14:26:17.081027 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" 
podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.036377 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-dnq6v"] Mar 18 14:26:19 crc kubenswrapper[4921]: E0318 14:26:19.037654 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c3a186-dfc4-4c33-9633-5d120a21f61d" containerName="oc" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.037674 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c3a186-dfc4-4c33-9633-5d120a21f61d" containerName="oc" Mar 18 14:26:19 crc kubenswrapper[4921]: E0318 14:26:19.037721 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.037730 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.038007 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c9aa9e7-5524-4d45-8b76-55b2e0beb89e" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.038043 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c3a186-dfc4-4c33-9633-5d120a21f61d" containerName="oc" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.038962 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.043454 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.043549 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.043683 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.044498 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.044534 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.061545 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-dnq6v"] Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.150589 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.150662 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz5sk\" (UniqueName: \"kubernetes.io/projected/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-kube-api-access-rz5sk\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" 
Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.150750 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ceph\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.150842 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.150926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-inventory\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.150994 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.253249 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-secret-0\") pod 
\"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.253414 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.253473 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz5sk\" (UniqueName: \"kubernetes.io/projected/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-kube-api-access-rz5sk\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.253536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ceph\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.253599 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.253667 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-inventory\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.259401 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.259820 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.260270 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ceph\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.260327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-inventory\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.260672 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.276502 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz5sk\" (UniqueName: \"kubernetes.io/projected/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-kube-api-access-rz5sk\") pod \"libvirt-openstack-openstack-cell1-dnq6v\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.358378 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.903157 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-dnq6v"] Mar 18 14:26:19 crc kubenswrapper[4921]: I0318 14:26:19.945195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" event={"ID":"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15","Type":"ContainerStarted","Data":"9d49f019df99cdbb12f8b278c2b1f722211f9d063599099863e77ec382fbc10a"} Mar 18 14:26:20 crc kubenswrapper[4921]: I0318 14:26:20.955157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" event={"ID":"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15","Type":"ContainerStarted","Data":"9c8359e5aaf6bd7b70536eaeff8b4cce73af938a9366344d179a91b0d69e32af"} Mar 18 14:26:20 crc kubenswrapper[4921]: I0318 14:26:20.984036 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" podStartSLOduration=1.774403996 podStartE2EDuration="1.984014236s" podCreationTimestamp="2026-03-18 14:26:19 +0000 
UTC" firstStartedPulling="2026-03-18 14:26:19.908727334 +0000 UTC m=+8199.458647983" lastFinishedPulling="2026-03-18 14:26:20.118337544 +0000 UTC m=+8199.668258223" observedRunningTime="2026-03-18 14:26:20.979163695 +0000 UTC m=+8200.529084334" watchObservedRunningTime="2026-03-18 14:26:20.984014236 +0000 UTC m=+8200.533934875" Mar 18 14:26:47 crc kubenswrapper[4921]: I0318 14:26:47.081561 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:26:47 crc kubenswrapper[4921]: I0318 14:26:47.082127 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.081399 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.081952 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.082009 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.083174 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f0ce8bfe0b110a175403a548ad96d572882eed877c07f03ec511ad314b9ca8f"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.083234 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://6f0ce8bfe0b110a175403a548ad96d572882eed877c07f03ec511ad314b9ca8f" gracePeriod=600 Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.580206 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="6f0ce8bfe0b110a175403a548ad96d572882eed877c07f03ec511ad314b9ca8f" exitCode=0 Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.580276 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"6f0ce8bfe0b110a175403a548ad96d572882eed877c07f03ec511ad314b9ca8f"} Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.580755 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be"} Mar 18 14:27:17 crc kubenswrapper[4921]: I0318 14:27:17.580845 4921 scope.go:117] "RemoveContainer" 
containerID="3d4a815347542f64001c2f815d9a57f65b64dab78c3054e251e942f9dd6bab88" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.169152 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564068-kcjv6"] Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.172083 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.174927 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.175305 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.176940 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.186290 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-kcjv6"] Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.301296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrgk\" (UniqueName: \"kubernetes.io/projected/9a19f478-e8f6-4959-916f-722781e821e4-kube-api-access-fzrgk\") pod \"auto-csr-approver-29564068-kcjv6\" (UID: \"9a19f478-e8f6-4959-916f-722781e821e4\") " pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.403755 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrgk\" (UniqueName: \"kubernetes.io/projected/9a19f478-e8f6-4959-916f-722781e821e4-kube-api-access-fzrgk\") pod \"auto-csr-approver-29564068-kcjv6\" (UID: \"9a19f478-e8f6-4959-916f-722781e821e4\") " 
pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.430932 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrgk\" (UniqueName: \"kubernetes.io/projected/9a19f478-e8f6-4959-916f-722781e821e4-kube-api-access-fzrgk\") pod \"auto-csr-approver-29564068-kcjv6\" (UID: \"9a19f478-e8f6-4959-916f-722781e821e4\") " pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:00 crc kubenswrapper[4921]: I0318 14:28:00.506608 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:01 crc kubenswrapper[4921]: I0318 14:28:01.033877 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-kcjv6"] Mar 18 14:28:01 crc kubenswrapper[4921]: I0318 14:28:01.066151 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" event={"ID":"9a19f478-e8f6-4959-916f-722781e821e4","Type":"ContainerStarted","Data":"b41b80c4531a268d23d2869090f98c58d8c211281919756c82330a26a7f7c13b"} Mar 18 14:28:03 crc kubenswrapper[4921]: I0318 14:28:03.088778 4921 generic.go:334] "Generic (PLEG): container finished" podID="9a19f478-e8f6-4959-916f-722781e821e4" containerID="becab43cd91748e0f45434974fe6c5abef0004134dce8b0f172c23892aa1cde8" exitCode=0 Mar 18 14:28:03 crc kubenswrapper[4921]: I0318 14:28:03.088833 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" event={"ID":"9a19f478-e8f6-4959-916f-722781e821e4","Type":"ContainerDied","Data":"becab43cd91748e0f45434974fe6c5abef0004134dce8b0f172c23892aa1cde8"} Mar 18 14:28:04 crc kubenswrapper[4921]: I0318 14:28:04.476607 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:04 crc kubenswrapper[4921]: I0318 14:28:04.598983 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzrgk\" (UniqueName: \"kubernetes.io/projected/9a19f478-e8f6-4959-916f-722781e821e4-kube-api-access-fzrgk\") pod \"9a19f478-e8f6-4959-916f-722781e821e4\" (UID: \"9a19f478-e8f6-4959-916f-722781e821e4\") " Mar 18 14:28:04 crc kubenswrapper[4921]: I0318 14:28:04.677845 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a19f478-e8f6-4959-916f-722781e821e4-kube-api-access-fzrgk" (OuterVolumeSpecName: "kube-api-access-fzrgk") pod "9a19f478-e8f6-4959-916f-722781e821e4" (UID: "9a19f478-e8f6-4959-916f-722781e821e4"). InnerVolumeSpecName "kube-api-access-fzrgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:28:04 crc kubenswrapper[4921]: I0318 14:28:04.700516 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzrgk\" (UniqueName: \"kubernetes.io/projected/9a19f478-e8f6-4959-916f-722781e821e4-kube-api-access-fzrgk\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:05 crc kubenswrapper[4921]: I0318 14:28:05.108430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" event={"ID":"9a19f478-e8f6-4959-916f-722781e821e4","Type":"ContainerDied","Data":"b41b80c4531a268d23d2869090f98c58d8c211281919756c82330a26a7f7c13b"} Mar 18 14:28:05 crc kubenswrapper[4921]: I0318 14:28:05.108473 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41b80c4531a268d23d2869090f98c58d8c211281919756c82330a26a7f7c13b" Mar 18 14:28:05 crc kubenswrapper[4921]: I0318 14:28:05.108536 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564068-kcjv6" Mar 18 14:28:05 crc kubenswrapper[4921]: I0318 14:28:05.558609 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-f7pbb"] Mar 18 14:28:05 crc kubenswrapper[4921]: I0318 14:28:05.570209 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564062-f7pbb"] Mar 18 14:28:07 crc kubenswrapper[4921]: I0318 14:28:07.222080 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59826938-a646-4eed-a604-f6a116f33f95" path="/var/lib/kubelet/pods/59826938-a646-4eed-a604-f6a116f33f95/volumes" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.453241 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zsvnl"] Mar 18 14:28:08 crc kubenswrapper[4921]: E0318 14:28:08.454182 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a19f478-e8f6-4959-916f-722781e821e4" containerName="oc" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.454202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19f478-e8f6-4959-916f-722781e821e4" containerName="oc" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.454622 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a19f478-e8f6-4959-916f-722781e821e4" containerName="oc" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.456600 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.476847 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsvnl"] Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.602363 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/241ab20e-560b-4f21-82f5-f42e95e18fea-kube-api-access-prtbc\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.602506 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-catalog-content\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.602570 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-utilities\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.705093 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/241ab20e-560b-4f21-82f5-f42e95e18fea-kube-api-access-prtbc\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.705232 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-catalog-content\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.705294 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-utilities\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.705755 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-catalog-content\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.705816 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-utilities\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.735039 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/241ab20e-560b-4f21-82f5-f42e95e18fea-kube-api-access-prtbc\") pod \"certified-operators-zsvnl\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:08 crc kubenswrapper[4921]: I0318 14:28:08.785148 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:09 crc kubenswrapper[4921]: I0318 14:28:09.364808 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsvnl"] Mar 18 14:28:10 crc kubenswrapper[4921]: I0318 14:28:10.174772 4921 generic.go:334] "Generic (PLEG): container finished" podID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerID="1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4" exitCode=0 Mar 18 14:28:10 crc kubenswrapper[4921]: I0318 14:28:10.174946 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerDied","Data":"1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4"} Mar 18 14:28:10 crc kubenswrapper[4921]: I0318 14:28:10.175602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerStarted","Data":"5d02b8c3cb230330aaca311b222fab0878507da3c25aa259e2c57cf46c3d0923"} Mar 18 14:28:12 crc kubenswrapper[4921]: I0318 14:28:12.200376 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerStarted","Data":"42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296"} Mar 18 14:28:13 crc kubenswrapper[4921]: I0318 14:28:13.216787 4921 generic.go:334] "Generic (PLEG): container finished" podID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerID="42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296" exitCode=0 Mar 18 14:28:13 crc kubenswrapper[4921]: I0318 14:28:13.222542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" 
event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerDied","Data":"42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296"} Mar 18 14:28:14 crc kubenswrapper[4921]: I0318 14:28:14.232862 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerStarted","Data":"ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba"} Mar 18 14:28:14 crc kubenswrapper[4921]: I0318 14:28:14.252320 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zsvnl" podStartSLOduration=2.76839917 podStartE2EDuration="6.252299059s" podCreationTimestamp="2026-03-18 14:28:08 +0000 UTC" firstStartedPulling="2026-03-18 14:28:10.17828217 +0000 UTC m=+8309.728202809" lastFinishedPulling="2026-03-18 14:28:13.662182049 +0000 UTC m=+8313.212102698" observedRunningTime="2026-03-18 14:28:14.248427416 +0000 UTC m=+8313.798348065" watchObservedRunningTime="2026-03-18 14:28:14.252299059 +0000 UTC m=+8313.802219688" Mar 18 14:28:17 crc kubenswrapper[4921]: I0318 14:28:17.135830 4921 scope.go:117] "RemoveContainer" containerID="148796a12551b30a55bb694a886214d6b2804952003bf5dde13bea4afc0f7cfe" Mar 18 14:28:18 crc kubenswrapper[4921]: I0318 14:28:18.785848 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:18 crc kubenswrapper[4921]: I0318 14:28:18.786364 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:18 crc kubenswrapper[4921]: I0318 14:28:18.873722 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:19 crc kubenswrapper[4921]: I0318 14:28:19.387315 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:19 crc kubenswrapper[4921]: I0318 14:28:19.456952 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zsvnl"] Mar 18 14:28:21 crc kubenswrapper[4921]: I0318 14:28:21.351768 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zsvnl" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="registry-server" containerID="cri-o://ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba" gracePeriod=2 Mar 18 14:28:21 crc kubenswrapper[4921]: I0318 14:28:21.905755 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.009798 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-catalog-content\") pod \"241ab20e-560b-4f21-82f5-f42e95e18fea\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.009968 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/241ab20e-560b-4f21-82f5-f42e95e18fea-kube-api-access-prtbc\") pod \"241ab20e-560b-4f21-82f5-f42e95e18fea\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.009988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-utilities\") pod \"241ab20e-560b-4f21-82f5-f42e95e18fea\" (UID: \"241ab20e-560b-4f21-82f5-f42e95e18fea\") " Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.011169 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-utilities" (OuterVolumeSpecName: "utilities") pod "241ab20e-560b-4f21-82f5-f42e95e18fea" (UID: "241ab20e-560b-4f21-82f5-f42e95e18fea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.020751 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241ab20e-560b-4f21-82f5-f42e95e18fea-kube-api-access-prtbc" (OuterVolumeSpecName: "kube-api-access-prtbc") pod "241ab20e-560b-4f21-82f5-f42e95e18fea" (UID: "241ab20e-560b-4f21-82f5-f42e95e18fea"). InnerVolumeSpecName "kube-api-access-prtbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.112667 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/241ab20e-560b-4f21-82f5-f42e95e18fea-kube-api-access-prtbc\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.112707 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.333426 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "241ab20e-560b-4f21-82f5-f42e95e18fea" (UID: "241ab20e-560b-4f21-82f5-f42e95e18fea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.366122 4921 generic.go:334] "Generic (PLEG): container finished" podID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerID="ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba" exitCode=0 Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.366176 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerDied","Data":"ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba"} Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.366188 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zsvnl" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.366222 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsvnl" event={"ID":"241ab20e-560b-4f21-82f5-f42e95e18fea","Type":"ContainerDied","Data":"5d02b8c3cb230330aaca311b222fab0878507da3c25aa259e2c57cf46c3d0923"} Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.366245 4921 scope.go:117] "RemoveContainer" containerID="ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.394504 4921 scope.go:117] "RemoveContainer" containerID="42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.419794 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241ab20e-560b-4f21-82f5-f42e95e18fea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.433261 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zsvnl"] Mar 18 14:28:22 crc kubenswrapper[4921]: 
I0318 14:28:22.439583 4921 scope.go:117] "RemoveContainer" containerID="1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.448021 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zsvnl"] Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.479739 4921 scope.go:117] "RemoveContainer" containerID="ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba" Mar 18 14:28:22 crc kubenswrapper[4921]: E0318 14:28:22.480522 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba\": container with ID starting with ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba not found: ID does not exist" containerID="ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.480564 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba"} err="failed to get container status \"ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba\": rpc error: code = NotFound desc = could not find container \"ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba\": container with ID starting with ea15390b8fd57dba57aca0b99eae96356fd9434b7fd8f480ee789abdc2fd83ba not found: ID does not exist" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.480588 4921 scope.go:117] "RemoveContainer" containerID="42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296" Mar 18 14:28:22 crc kubenswrapper[4921]: E0318 14:28:22.481546 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296\": container 
with ID starting with 42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296 not found: ID does not exist" containerID="42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.481592 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296"} err="failed to get container status \"42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296\": rpc error: code = NotFound desc = could not find container \"42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296\": container with ID starting with 42a8c65a632979166deb38c54a5830367bbe64cb08922bf0528bf9f7d896a296 not found: ID does not exist" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.481623 4921 scope.go:117] "RemoveContainer" containerID="1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4" Mar 18 14:28:22 crc kubenswrapper[4921]: E0318 14:28:22.481859 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4\": container with ID starting with 1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4 not found: ID does not exist" containerID="1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4" Mar 18 14:28:22 crc kubenswrapper[4921]: I0318 14:28:22.481886 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4"} err="failed to get container status \"1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4\": rpc error: code = NotFound desc = could not find container \"1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4\": container with ID starting with 1b38ee815c1bfe6fca614bb0294951abcfa2c678d2dc14854df342f21077add4 not 
found: ID does not exist" Mar 18 14:28:23 crc kubenswrapper[4921]: I0318 14:28:23.234358 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" path="/var/lib/kubelet/pods/241ab20e-560b-4f21-82f5-f42e95e18fea/volumes" Mar 18 14:28:28 crc kubenswrapper[4921]: I0318 14:28:28.447628 4921 generic.go:334] "Generic (PLEG): container finished" podID="5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" containerID="9c8359e5aaf6bd7b70536eaeff8b4cce73af938a9366344d179a91b0d69e32af" exitCode=2 Mar 18 14:28:28 crc kubenswrapper[4921]: I0318 14:28:28.447722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" event={"ID":"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15","Type":"ContainerDied","Data":"9c8359e5aaf6bd7b70536eaeff8b4cce73af938a9366344d179a91b0d69e32af"} Mar 18 14:28:29 crc kubenswrapper[4921]: I0318 14:28:29.975731 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.042092 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz5sk\" (UniqueName: \"kubernetes.io/projected/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-kube-api-access-rz5sk\") pod \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.042351 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ssh-key-openstack-cell1\") pod \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.042391 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-secret-0\") pod \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.042556 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ceph\") pod \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.042611 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-inventory\") pod \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.042644 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-combined-ca-bundle\") pod \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\" (UID: \"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15\") " Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.050278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" (UID: "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.051261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ceph" (OuterVolumeSpecName: "ceph") pod "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" (UID: "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.056366 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-kube-api-access-rz5sk" (OuterVolumeSpecName: "kube-api-access-rz5sk") pod "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" (UID: "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15"). InnerVolumeSpecName "kube-api-access-rz5sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.087461 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-inventory" (OuterVolumeSpecName: "inventory") pod "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" (UID: "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.088172 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" (UID: "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.095258 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" (UID: "5e5ce1c9-5d40-4815-bab4-c0f2fc073e15"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.145046 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.145079 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.145092 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.145101 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.145150 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.145160 4921 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rz5sk\" (UniqueName: \"kubernetes.io/projected/5e5ce1c9-5d40-4815-bab4-c0f2fc073e15-kube-api-access-rz5sk\") on node \"crc\" DevicePath \"\"" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.474923 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" event={"ID":"5e5ce1c9-5d40-4815-bab4-c0f2fc073e15","Type":"ContainerDied","Data":"9d49f019df99cdbb12f8b278c2b1f722211f9d063599099863e77ec382fbc10a"} Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.475398 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d49f019df99cdbb12f8b278c2b1f722211f9d063599099863e77ec382fbc10a" Mar 18 14:28:30 crc kubenswrapper[4921]: I0318 14:28:30.475030 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-dnq6v" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.031720 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-gm8q9"] Mar 18 14:28:47 crc kubenswrapper[4921]: E0318 14:28:47.032635 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="extract-utilities" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.032647 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="extract-utilities" Mar 18 14:28:47 crc kubenswrapper[4921]: E0318 14:28:47.032669 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.032675 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:28:47 crc kubenswrapper[4921]: E0318 14:28:47.032694 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="extract-content" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.032700 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="extract-content" Mar 18 14:28:47 crc kubenswrapper[4921]: E0318 14:28:47.032711 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="registry-server" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.032717 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="registry-server" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.032903 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="241ab20e-560b-4f21-82f5-f42e95e18fea" containerName="registry-server" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.032922 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5ce1c9-5d40-4815-bab4-c0f2fc073e15" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.033651 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.035657 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.035919 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.036446 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.037082 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.038211 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.044448 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-gm8q9"] Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.208036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.208487 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ceph\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc 
kubenswrapper[4921]: I0318 14:28:47.209530 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.210268 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-inventory\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.210601 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27whx\" (UniqueName: \"kubernetes.io/projected/88ac5388-d9ec-408e-89d8-49c6c098a33b-kube-api-access-27whx\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.210860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.313948 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27whx\" (UniqueName: \"kubernetes.io/projected/88ac5388-d9ec-408e-89d8-49c6c098a33b-kube-api-access-27whx\") pod 
\"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.316767 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.317024 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.317105 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ceph\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.320233 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.320654 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-inventory\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.324365 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.324640 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ceph\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.324648 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.328001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-inventory\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.328889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" 
(UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.345199 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27whx\" (UniqueName: \"kubernetes.io/projected/88ac5388-d9ec-408e-89d8-49c6c098a33b-kube-api-access-27whx\") pod \"libvirt-openstack-openstack-cell1-gm8q9\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.349948 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:28:47 crc kubenswrapper[4921]: I0318 14:28:47.979984 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-gm8q9"] Mar 18 14:28:47 crc kubenswrapper[4921]: W0318 14:28:47.985314 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ac5388_d9ec_408e_89d8_49c6c098a33b.slice/crio-7cb9e16753a09ac127257c9e0f39cf57b4cd8d9968471971640bbd02934df98c WatchSource:0}: Error finding container 7cb9e16753a09ac127257c9e0f39cf57b4cd8d9968471971640bbd02934df98c: Status 404 returned error can't find the container with id 7cb9e16753a09ac127257c9e0f39cf57b4cd8d9968471971640bbd02934df98c Mar 18 14:28:48 crc kubenswrapper[4921]: I0318 14:28:48.764131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" event={"ID":"88ac5388-d9ec-408e-89d8-49c6c098a33b","Type":"ContainerStarted","Data":"d6835ce60fc0c7c0a3a459ea2aa9327293f52ba2548d931d57a0dcd3a07d2496"} Mar 18 14:28:48 crc kubenswrapper[4921]: I0318 14:28:48.764179 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" event={"ID":"88ac5388-d9ec-408e-89d8-49c6c098a33b","Type":"ContainerStarted","Data":"7cb9e16753a09ac127257c9e0f39cf57b4cd8d9968471971640bbd02934df98c"} Mar 18 14:28:48 crc kubenswrapper[4921]: I0318 14:28:48.798416 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" podStartSLOduration=1.600699615 podStartE2EDuration="1.798392411s" podCreationTimestamp="2026-03-18 14:28:47 +0000 UTC" firstStartedPulling="2026-03-18 14:28:47.988631099 +0000 UTC m=+8347.538551738" lastFinishedPulling="2026-03-18 14:28:48.186323885 +0000 UTC m=+8347.736244534" observedRunningTime="2026-03-18 14:28:48.787619229 +0000 UTC m=+8348.337539878" watchObservedRunningTime="2026-03-18 14:28:48.798392411 +0000 UTC m=+8348.348313060" Mar 18 14:29:02 crc kubenswrapper[4921]: I0318 14:29:02.899904 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7nr8c"] Mar 18 14:29:02 crc kubenswrapper[4921]: I0318 14:29:02.903072 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:02 crc kubenswrapper[4921]: I0318 14:29:02.912065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-utilities\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:02 crc kubenswrapper[4921]: I0318 14:29:02.912490 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vb5k\" (UniqueName: \"kubernetes.io/projected/f7c2b918-5ac8-498f-9643-3ae958625f0d-kube-api-access-9vb5k\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:02 crc kubenswrapper[4921]: I0318 14:29:02.912855 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-catalog-content\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:02 crc kubenswrapper[4921]: I0318 14:29:02.920011 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nr8c"] Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.015237 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-utilities\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.015283 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9vb5k\" (UniqueName: \"kubernetes.io/projected/f7c2b918-5ac8-498f-9643-3ae958625f0d-kube-api-access-9vb5k\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.015352 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-catalog-content\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.015865 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-utilities\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.015912 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-catalog-content\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.032623 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vb5k\" (UniqueName: \"kubernetes.io/projected/f7c2b918-5ac8-498f-9643-3ae958625f0d-kube-api-access-9vb5k\") pod \"community-operators-7nr8c\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.237589 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.774291 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nr8c"] Mar 18 14:29:03 crc kubenswrapper[4921]: W0318 14:29:03.783080 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c2b918_5ac8_498f_9643_3ae958625f0d.slice/crio-a718038a1b278c9cfe7ede926822ec0c8f48afbc2f78a2d4762e782c105ad38f WatchSource:0}: Error finding container a718038a1b278c9cfe7ede926822ec0c8f48afbc2f78a2d4762e782c105ad38f: Status 404 returned error can't find the container with id a718038a1b278c9cfe7ede926822ec0c8f48afbc2f78a2d4762e782c105ad38f Mar 18 14:29:03 crc kubenswrapper[4921]: I0318 14:29:03.951350 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerStarted","Data":"a718038a1b278c9cfe7ede926822ec0c8f48afbc2f78a2d4762e782c105ad38f"} Mar 18 14:29:04 crc kubenswrapper[4921]: I0318 14:29:04.971186 4921 generic.go:334] "Generic (PLEG): container finished" podID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerID="25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8" exitCode=0 Mar 18 14:29:04 crc kubenswrapper[4921]: I0318 14:29:04.971251 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerDied","Data":"25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8"} Mar 18 14:29:04 crc kubenswrapper[4921]: I0318 14:29:04.974739 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:29:05 crc kubenswrapper[4921]: I0318 14:29:05.993759 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerStarted","Data":"124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100"} Mar 18 14:29:07 crc kubenswrapper[4921]: I0318 14:29:07.004196 4921 generic.go:334] "Generic (PLEG): container finished" podID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerID="124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100" exitCode=0 Mar 18 14:29:07 crc kubenswrapper[4921]: I0318 14:29:07.004361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerDied","Data":"124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100"} Mar 18 14:29:08 crc kubenswrapper[4921]: I0318 14:29:08.015693 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerStarted","Data":"242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f"} Mar 18 14:29:08 crc kubenswrapper[4921]: I0318 14:29:08.045798 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7nr8c" podStartSLOduration=3.570520533 podStartE2EDuration="6.04576124s" podCreationTimestamp="2026-03-18 14:29:02 +0000 UTC" firstStartedPulling="2026-03-18 14:29:04.974354148 +0000 UTC m=+8364.524274817" lastFinishedPulling="2026-03-18 14:29:07.449594875 +0000 UTC m=+8366.999515524" observedRunningTime="2026-03-18 14:29:08.032059334 +0000 UTC m=+8367.581979983" watchObservedRunningTime="2026-03-18 14:29:08.04576124 +0000 UTC m=+8367.595681929" Mar 18 14:29:13 crc kubenswrapper[4921]: I0318 14:29:13.239216 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:13 crc kubenswrapper[4921]: I0318 14:29:13.241305 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:13 crc kubenswrapper[4921]: I0318 14:29:13.302742 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:14 crc kubenswrapper[4921]: I0318 14:29:14.162421 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:14 crc kubenswrapper[4921]: I0318 14:29:14.229520 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nr8c"] Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.120864 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7nr8c" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="registry-server" containerID="cri-o://242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f" gracePeriod=2 Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.724058 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.871738 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-utilities\") pod \"f7c2b918-5ac8-498f-9643-3ae958625f0d\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.871807 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-catalog-content\") pod \"f7c2b918-5ac8-498f-9643-3ae958625f0d\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.871895 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vb5k\" (UniqueName: \"kubernetes.io/projected/f7c2b918-5ac8-498f-9643-3ae958625f0d-kube-api-access-9vb5k\") pod \"f7c2b918-5ac8-498f-9643-3ae958625f0d\" (UID: \"f7c2b918-5ac8-498f-9643-3ae958625f0d\") " Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.873095 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-utilities" (OuterVolumeSpecName: "utilities") pod "f7c2b918-5ac8-498f-9643-3ae958625f0d" (UID: "f7c2b918-5ac8-498f-9643-3ae958625f0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.885602 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c2b918-5ac8-498f-9643-3ae958625f0d-kube-api-access-9vb5k" (OuterVolumeSpecName: "kube-api-access-9vb5k") pod "f7c2b918-5ac8-498f-9643-3ae958625f0d" (UID: "f7c2b918-5ac8-498f-9643-3ae958625f0d"). InnerVolumeSpecName "kube-api-access-9vb5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.927384 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7c2b918-5ac8-498f-9643-3ae958625f0d" (UID: "f7c2b918-5ac8-498f-9643-3ae958625f0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.974887 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.974930 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c2b918-5ac8-498f-9643-3ae958625f0d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:16 crc kubenswrapper[4921]: I0318 14:29:16.974948 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vb5k\" (UniqueName: \"kubernetes.io/projected/f7c2b918-5ac8-498f-9643-3ae958625f0d-kube-api-access-9vb5k\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.080821 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.080881 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.132535 4921 generic.go:334] "Generic (PLEG): container finished" podID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerID="242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f" exitCode=0 Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.132593 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerDied","Data":"242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f"} Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.132623 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nr8c" event={"ID":"f7c2b918-5ac8-498f-9643-3ae958625f0d","Type":"ContainerDied","Data":"a718038a1b278c9cfe7ede926822ec0c8f48afbc2f78a2d4762e782c105ad38f"} Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.132641 4921 scope.go:117] "RemoveContainer" containerID="242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.132671 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7nr8c" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.163246 4921 scope.go:117] "RemoveContainer" containerID="124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.192264 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nr8c"] Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.196378 4921 scope.go:117] "RemoveContainer" containerID="25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.205003 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7nr8c"] Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.251745 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" path="/var/lib/kubelet/pods/f7c2b918-5ac8-498f-9643-3ae958625f0d/volumes" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.258179 4921 scope.go:117] "RemoveContainer" containerID="242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f" Mar 18 14:29:17 crc kubenswrapper[4921]: E0318 14:29:17.258664 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f\": container with ID starting with 242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f not found: ID does not exist" containerID="242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.258698 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f"} err="failed to get container status 
\"242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f\": rpc error: code = NotFound desc = could not find container \"242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f\": container with ID starting with 242b0a3acf07c69094a113add37155ca98601ab29ac674ef623ebd025c5efb8f not found: ID does not exist" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.258722 4921 scope.go:117] "RemoveContainer" containerID="124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100" Mar 18 14:29:17 crc kubenswrapper[4921]: E0318 14:29:17.259034 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100\": container with ID starting with 124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100 not found: ID does not exist" containerID="124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.259060 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100"} err="failed to get container status \"124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100\": rpc error: code = NotFound desc = could not find container \"124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100\": container with ID starting with 124b0af67fb9983d1aa3632cb103c4de21628f357bb061cac5f74cd466909100 not found: ID does not exist" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.259076 4921 scope.go:117] "RemoveContainer" containerID="25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8" Mar 18 14:29:17 crc kubenswrapper[4921]: E0318 14:29:17.259352 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8\": container with ID starting with 25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8 not found: ID does not exist" containerID="25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8" Mar 18 14:29:17 crc kubenswrapper[4921]: I0318 14:29:17.259383 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8"} err="failed to get container status \"25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8\": rpc error: code = NotFound desc = could not find container \"25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8\": container with ID starting with 25d4220c2fac07604814a9274bfe137416fa534d4653a0c8cd182da3aa6fbaa8 not found: ID does not exist" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.316476 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vkcg4"] Mar 18 14:29:23 crc kubenswrapper[4921]: E0318 14:29:23.319633 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="extract-content" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.319651 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="extract-content" Mar 18 14:29:23 crc kubenswrapper[4921]: E0318 14:29:23.319676 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="extract-utilities" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.319683 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="extract-utilities" Mar 18 14:29:23 crc kubenswrapper[4921]: E0318 14:29:23.319711 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="registry-server" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.319728 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="registry-server" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.319953 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c2b918-5ac8-498f-9643-3ae958625f0d" containerName="registry-server" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.324095 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.372685 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkcg4"] Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.457600 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flwzz\" (UniqueName: \"kubernetes.io/projected/9ff8db26-dc24-4ddb-8f10-480d42cb3145-kube-api-access-flwzz\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.457685 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-utilities\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.458130 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-catalog-content\") pod \"redhat-operators-vkcg4\" (UID: 
\"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.560750 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-utilities\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.560941 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-catalog-content\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.561065 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flwzz\" (UniqueName: \"kubernetes.io/projected/9ff8db26-dc24-4ddb-8f10-480d42cb3145-kube-api-access-flwzz\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.561499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-utilities\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.561550 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-catalog-content\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " 
pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.588825 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flwzz\" (UniqueName: \"kubernetes.io/projected/9ff8db26-dc24-4ddb-8f10-480d42cb3145-kube-api-access-flwzz\") pod \"redhat-operators-vkcg4\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:23 crc kubenswrapper[4921]: I0318 14:29:23.674772 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:24 crc kubenswrapper[4921]: I0318 14:29:24.229697 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkcg4"] Mar 18 14:29:24 crc kubenswrapper[4921]: W0318 14:29:24.238636 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff8db26_dc24_4ddb_8f10_480d42cb3145.slice/crio-857aa2ae7da2a3cb09e1768fc8d18e464b2cc9cde658e412eb3f2b4670b90f8f WatchSource:0}: Error finding container 857aa2ae7da2a3cb09e1768fc8d18e464b2cc9cde658e412eb3f2b4670b90f8f: Status 404 returned error can't find the container with id 857aa2ae7da2a3cb09e1768fc8d18e464b2cc9cde658e412eb3f2b4670b90f8f Mar 18 14:29:25 crc kubenswrapper[4921]: I0318 14:29:25.230127 4921 generic.go:334] "Generic (PLEG): container finished" podID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerID="f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb" exitCode=0 Mar 18 14:29:25 crc kubenswrapper[4921]: I0318 14:29:25.230179 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerDied","Data":"f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb"} Mar 18 14:29:25 crc kubenswrapper[4921]: I0318 14:29:25.230661 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerStarted","Data":"857aa2ae7da2a3cb09e1768fc8d18e464b2cc9cde658e412eb3f2b4670b90f8f"} Mar 18 14:29:26 crc kubenswrapper[4921]: I0318 14:29:26.244422 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerStarted","Data":"df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012"} Mar 18 14:29:29 crc kubenswrapper[4921]: I0318 14:29:29.283456 4921 generic.go:334] "Generic (PLEG): container finished" podID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerID="df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012" exitCode=0 Mar 18 14:29:29 crc kubenswrapper[4921]: I0318 14:29:29.283538 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerDied","Data":"df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012"} Mar 18 14:29:30 crc kubenswrapper[4921]: I0318 14:29:30.295126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerStarted","Data":"7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e"} Mar 18 14:29:30 crc kubenswrapper[4921]: I0318 14:29:30.342914 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vkcg4" podStartSLOduration=2.859160352 podStartE2EDuration="7.342895096s" podCreationTimestamp="2026-03-18 14:29:23 +0000 UTC" firstStartedPulling="2026-03-18 14:29:25.232557274 +0000 UTC m=+8384.782477913" lastFinishedPulling="2026-03-18 14:29:29.716292018 +0000 UTC m=+8389.266212657" observedRunningTime="2026-03-18 14:29:30.315979946 +0000 UTC 
m=+8389.865900585" watchObservedRunningTime="2026-03-18 14:29:30.342895096 +0000 UTC m=+8389.892815735" Mar 18 14:29:33 crc kubenswrapper[4921]: I0318 14:29:33.675383 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:33 crc kubenswrapper[4921]: I0318 14:29:33.676062 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:34 crc kubenswrapper[4921]: I0318 14:29:34.732240 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vkcg4" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="registry-server" probeResult="failure" output=< Mar 18 14:29:34 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 18 14:29:34 crc kubenswrapper[4921]: > Mar 18 14:29:43 crc kubenswrapper[4921]: I0318 14:29:43.745733 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:43 crc kubenswrapper[4921]: I0318 14:29:43.834856 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:43 crc kubenswrapper[4921]: I0318 14:29:43.990481 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkcg4"] Mar 18 14:29:45 crc kubenswrapper[4921]: I0318 14:29:45.482730 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vkcg4" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="registry-server" containerID="cri-o://7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e" gracePeriod=2 Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.084791 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.208037 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-catalog-content\") pod \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.208534 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flwzz\" (UniqueName: \"kubernetes.io/projected/9ff8db26-dc24-4ddb-8f10-480d42cb3145-kube-api-access-flwzz\") pod \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.208727 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-utilities\") pod \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\" (UID: \"9ff8db26-dc24-4ddb-8f10-480d42cb3145\") " Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.209422 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-utilities" (OuterVolumeSpecName: "utilities") pod "9ff8db26-dc24-4ddb-8f10-480d42cb3145" (UID: "9ff8db26-dc24-4ddb-8f10-480d42cb3145"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.209983 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.223918 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff8db26-dc24-4ddb-8f10-480d42cb3145-kube-api-access-flwzz" (OuterVolumeSpecName: "kube-api-access-flwzz") pod "9ff8db26-dc24-4ddb-8f10-480d42cb3145" (UID: "9ff8db26-dc24-4ddb-8f10-480d42cb3145"). InnerVolumeSpecName "kube-api-access-flwzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.311785 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flwzz\" (UniqueName: \"kubernetes.io/projected/9ff8db26-dc24-4ddb-8f10-480d42cb3145-kube-api-access-flwzz\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.352747 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ff8db26-dc24-4ddb-8f10-480d42cb3145" (UID: "9ff8db26-dc24-4ddb-8f10-480d42cb3145"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.413715 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ff8db26-dc24-4ddb-8f10-480d42cb3145-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.494047 4921 generic.go:334] "Generic (PLEG): container finished" podID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerID="7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e" exitCode=0 Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.494132 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkcg4" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.494144 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerDied","Data":"7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e"} Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.495290 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkcg4" event={"ID":"9ff8db26-dc24-4ddb-8f10-480d42cb3145","Type":"ContainerDied","Data":"857aa2ae7da2a3cb09e1768fc8d18e464b2cc9cde658e412eb3f2b4670b90f8f"} Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.495317 4921 scope.go:117] "RemoveContainer" containerID="7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.545934 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkcg4"] Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.549504 4921 scope.go:117] "RemoveContainer" containerID="df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 
14:29:46.557557 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vkcg4"] Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.576254 4921 scope.go:117] "RemoveContainer" containerID="f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.627691 4921 scope.go:117] "RemoveContainer" containerID="7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e" Mar 18 14:29:46 crc kubenswrapper[4921]: E0318 14:29:46.628050 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e\": container with ID starting with 7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e not found: ID does not exist" containerID="7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.628091 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e"} err="failed to get container status \"7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e\": rpc error: code = NotFound desc = could not find container \"7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e\": container with ID starting with 7e536708a37d1586776c964d228223c62dbe976f0957631dfd22b13517b9842e not found: ID does not exist" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.628131 4921 scope.go:117] "RemoveContainer" containerID="df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012" Mar 18 14:29:46 crc kubenswrapper[4921]: E0318 14:29:46.628535 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012\": container with ID 
starting with df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012 not found: ID does not exist" containerID="df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.628580 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012"} err="failed to get container status \"df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012\": rpc error: code = NotFound desc = could not find container \"df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012\": container with ID starting with df478f4793e99012347351eca6c3226c24c7a86a7278cc04523a883db369b012 not found: ID does not exist" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.628601 4921 scope.go:117] "RemoveContainer" containerID="f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb" Mar 18 14:29:46 crc kubenswrapper[4921]: E0318 14:29:46.629048 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb\": container with ID starting with f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb not found: ID does not exist" containerID="f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb" Mar 18 14:29:46 crc kubenswrapper[4921]: I0318 14:29:46.629082 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb"} err="failed to get container status \"f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb\": rpc error: code = NotFound desc = could not find container \"f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb\": container with ID starting with f7e94e9d185a1382b8b415ae0ea2ac0c2e290da89655e0ffeed451f7426ec8bb not found: 
ID does not exist" Mar 18 14:29:47 crc kubenswrapper[4921]: I0318 14:29:47.080953 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:29:47 crc kubenswrapper[4921]: I0318 14:29:47.081436 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:29:47 crc kubenswrapper[4921]: I0318 14:29:47.224487 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" path="/var/lib/kubelet/pods/9ff8db26-dc24-4ddb-8f10-480d42cb3145/volumes" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.199664 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564070-rglgp"] Mar 18 14:30:00 crc kubenswrapper[4921]: E0318 14:30:00.201242 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="extract-utilities" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.201263 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="extract-utilities" Mar 18 14:30:00 crc kubenswrapper[4921]: E0318 14:30:00.201282 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="registry-server" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.201293 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="registry-server" Mar 18 
14:30:00 crc kubenswrapper[4921]: E0318 14:30:00.201327 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="extract-content" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.201337 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="extract-content" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.201706 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff8db26-dc24-4ddb-8f10-480d42cb3145" containerName="registry-server" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.202857 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.208677 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.208744 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.209876 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.219224 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7"] Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.222548 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.224256 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.228691 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.238847 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-rglgp"] Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.252651 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7"] Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.371299 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9135fd81-c79e-455b-a614-593664b770f3-config-volume\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.371591 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9135fd81-c79e-455b-a614-593664b770f3-secret-volume\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.371709 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnv4z\" (UniqueName: 
\"kubernetes.io/projected/d9dcd334-bc90-417e-81fb-39563ba30e42-kube-api-access-dnv4z\") pod \"auto-csr-approver-29564070-rglgp\" (UID: \"d9dcd334-bc90-417e-81fb-39563ba30e42\") " pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.371917 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rvn\" (UniqueName: \"kubernetes.io/projected/9135fd81-c79e-455b-a614-593664b770f3-kube-api-access-77rvn\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.474689 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77rvn\" (UniqueName: \"kubernetes.io/projected/9135fd81-c79e-455b-a614-593664b770f3-kube-api-access-77rvn\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.474868 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9135fd81-c79e-455b-a614-593664b770f3-config-volume\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.474916 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9135fd81-c79e-455b-a614-593664b770f3-secret-volume\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc 
kubenswrapper[4921]: I0318 14:30:00.474974 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnv4z\" (UniqueName: \"kubernetes.io/projected/d9dcd334-bc90-417e-81fb-39563ba30e42-kube-api-access-dnv4z\") pod \"auto-csr-approver-29564070-rglgp\" (UID: \"d9dcd334-bc90-417e-81fb-39563ba30e42\") " pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.477067 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9135fd81-c79e-455b-a614-593664b770f3-config-volume\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.485189 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9135fd81-c79e-455b-a614-593664b770f3-secret-volume\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.500911 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnv4z\" (UniqueName: \"kubernetes.io/projected/d9dcd334-bc90-417e-81fb-39563ba30e42-kube-api-access-dnv4z\") pod \"auto-csr-approver-29564070-rglgp\" (UID: \"d9dcd334-bc90-417e-81fb-39563ba30e42\") " pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.507459 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rvn\" (UniqueName: \"kubernetes.io/projected/9135fd81-c79e-455b-a614-593664b770f3-kube-api-access-77rvn\") pod \"collect-profiles-29564070-6lct7\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.526107 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:00 crc kubenswrapper[4921]: I0318 14:30:00.540352 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:01 crc kubenswrapper[4921]: I0318 14:30:01.057684 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7"] Mar 18 14:30:01 crc kubenswrapper[4921]: I0318 14:30:01.134977 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-rglgp"] Mar 18 14:30:01 crc kubenswrapper[4921]: I0318 14:30:01.727849 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" event={"ID":"9135fd81-c79e-455b-a614-593664b770f3","Type":"ContainerStarted","Data":"e202af5d322e95723acf282214bf80ece1095b92614a5b48ffb391312bb541e5"} Mar 18 14:30:01 crc kubenswrapper[4921]: I0318 14:30:01.729645 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" event={"ID":"9135fd81-c79e-455b-a614-593664b770f3","Type":"ContainerStarted","Data":"83bc7c0192de5bed25b3022fd49726e403b9b7af399147f6b578bfb8ee448e3c"} Mar 18 14:30:01 crc kubenswrapper[4921]: I0318 14:30:01.729827 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-rglgp" event={"ID":"d9dcd334-bc90-417e-81fb-39563ba30e42","Type":"ContainerStarted","Data":"0193a2294d1924a73bd6596cf90ca4f9146a445cf697eae84b8ac10ee329ee56"} Mar 18 14:30:02 crc kubenswrapper[4921]: I0318 14:30:02.743715 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="9135fd81-c79e-455b-a614-593664b770f3" containerID="e202af5d322e95723acf282214bf80ece1095b92614a5b48ffb391312bb541e5" exitCode=0 Mar 18 14:30:02 crc kubenswrapper[4921]: I0318 14:30:02.743814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" event={"ID":"9135fd81-c79e-455b-a614-593664b770f3","Type":"ContainerDied","Data":"e202af5d322e95723acf282214bf80ece1095b92614a5b48ffb391312bb541e5"} Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.208649 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.353987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77rvn\" (UniqueName: \"kubernetes.io/projected/9135fd81-c79e-455b-a614-593664b770f3-kube-api-access-77rvn\") pod \"9135fd81-c79e-455b-a614-593664b770f3\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.354879 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9135fd81-c79e-455b-a614-593664b770f3-config-volume\") pod \"9135fd81-c79e-455b-a614-593664b770f3\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.355222 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9135fd81-c79e-455b-a614-593664b770f3-secret-volume\") pod \"9135fd81-c79e-455b-a614-593664b770f3\" (UID: \"9135fd81-c79e-455b-a614-593664b770f3\") " Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.355966 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9135fd81-c79e-455b-a614-593664b770f3-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "9135fd81-c79e-455b-a614-593664b770f3" (UID: "9135fd81-c79e-455b-a614-593664b770f3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.358084 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9135fd81-c79e-455b-a614-593664b770f3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.361174 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9135fd81-c79e-455b-a614-593664b770f3-kube-api-access-77rvn" (OuterVolumeSpecName: "kube-api-access-77rvn") pod "9135fd81-c79e-455b-a614-593664b770f3" (UID: "9135fd81-c79e-455b-a614-593664b770f3"). InnerVolumeSpecName "kube-api-access-77rvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.365243 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9135fd81-c79e-455b-a614-593664b770f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9135fd81-c79e-455b-a614-593664b770f3" (UID: "9135fd81-c79e-455b-a614-593664b770f3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.460314 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9135fd81-c79e-455b-a614-593664b770f3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.460349 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77rvn\" (UniqueName: \"kubernetes.io/projected/9135fd81-c79e-455b-a614-593664b770f3-kube-api-access-77rvn\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.756889 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.756918 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564070-6lct7" event={"ID":"9135fd81-c79e-455b-a614-593664b770f3","Type":"ContainerDied","Data":"83bc7c0192de5bed25b3022fd49726e403b9b7af399147f6b578bfb8ee448e3c"} Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.756999 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83bc7c0192de5bed25b3022fd49726e403b9b7af399147f6b578bfb8ee448e3c" Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.760341 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9dcd334-bc90-417e-81fb-39563ba30e42" containerID="1eca05bc1d55d9f308a2e33c61ca885908597a4f8d63d93a936193576bc0da40" exitCode=0 Mar 18 14:30:03 crc kubenswrapper[4921]: I0318 14:30:03.760381 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-rglgp" event={"ID":"d9dcd334-bc90-417e-81fb-39563ba30e42","Type":"ContainerDied","Data":"1eca05bc1d55d9f308a2e33c61ca885908597a4f8d63d93a936193576bc0da40"} Mar 18 14:30:04 crc kubenswrapper[4921]: 
I0318 14:30:04.298667 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq"] Mar 18 14:30:04 crc kubenswrapper[4921]: I0318 14:30:04.306678 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-n6vtq"] Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.235638 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dfcce58-9dd7-456d-a101-6b1d5ba04d54" path="/var/lib/kubelet/pods/1dfcce58-9dd7-456d-a101-6b1d5ba04d54/volumes" Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.540849 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.717884 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnv4z\" (UniqueName: \"kubernetes.io/projected/d9dcd334-bc90-417e-81fb-39563ba30e42-kube-api-access-dnv4z\") pod \"d9dcd334-bc90-417e-81fb-39563ba30e42\" (UID: \"d9dcd334-bc90-417e-81fb-39563ba30e42\") " Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.724507 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9dcd334-bc90-417e-81fb-39563ba30e42-kube-api-access-dnv4z" (OuterVolumeSpecName: "kube-api-access-dnv4z") pod "d9dcd334-bc90-417e-81fb-39563ba30e42" (UID: "d9dcd334-bc90-417e-81fb-39563ba30e42"). InnerVolumeSpecName "kube-api-access-dnv4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.790812 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564070-rglgp" event={"ID":"d9dcd334-bc90-417e-81fb-39563ba30e42","Type":"ContainerDied","Data":"0193a2294d1924a73bd6596cf90ca4f9146a445cf697eae84b8ac10ee329ee56"} Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.790850 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0193a2294d1924a73bd6596cf90ca4f9146a445cf697eae84b8ac10ee329ee56" Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.790903 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564070-rglgp" Mar 18 14:30:05 crc kubenswrapper[4921]: I0318 14:30:05.820911 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnv4z\" (UniqueName: \"kubernetes.io/projected/d9dcd334-bc90-417e-81fb-39563ba30e42-kube-api-access-dnv4z\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:06 crc kubenswrapper[4921]: I0318 14:30:06.636339 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-txjf7"] Mar 18 14:30:06 crc kubenswrapper[4921]: I0318 14:30:06.652908 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564064-txjf7"] Mar 18 14:30:07 crc kubenswrapper[4921]: I0318 14:30:07.227721 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53935d6-fe34-43a1-a940-537ecd9e161a" path="/var/lib/kubelet/pods/b53935d6-fe34-43a1-a940-537ecd9e161a/volumes" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.080897 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.081514 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.081574 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.082582 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.082654 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" gracePeriod=600 Mar 18 14:30:17 crc kubenswrapper[4921]: E0318 14:30:17.217372 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:30:17 crc kubenswrapper[4921]: 
I0318 14:30:17.344427 4921 scope.go:117] "RemoveContainer" containerID="0acccaae3270202bc784c753fc39b93c3c148301e6c00744c4fcc3006dd7fc20" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.375823 4921 scope.go:117] "RemoveContainer" containerID="1a57329cda21ed5521ebe5cebe023dacf1c068944a541bd5d41fdf9ea9432c82" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.940720 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" exitCode=0 Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.940781 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be"} Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.940861 4921 scope.go:117] "RemoveContainer" containerID="6f0ce8bfe0b110a175403a548ad96d572882eed877c07f03ec511ad314b9ca8f" Mar 18 14:30:17 crc kubenswrapper[4921]: I0318 14:30:17.941663 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:30:17 crc kubenswrapper[4921]: E0318 14:30:17.942613 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:30:33 crc kubenswrapper[4921]: I0318 14:30:33.209488 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:30:33 crc kubenswrapper[4921]: E0318 
14:30:33.210287 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:30:41 crc kubenswrapper[4921]: I0318 14:30:41.257745 4921 generic.go:334] "Generic (PLEG): container finished" podID="88ac5388-d9ec-408e-89d8-49c6c098a33b" containerID="d6835ce60fc0c7c0a3a459ea2aa9327293f52ba2548d931d57a0dcd3a07d2496" exitCode=2 Mar 18 14:30:41 crc kubenswrapper[4921]: I0318 14:30:41.258366 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" event={"ID":"88ac5388-d9ec-408e-89d8-49c6c098a33b","Type":"ContainerDied","Data":"d6835ce60fc0c7c0a3a459ea2aa9327293f52ba2548d931d57a0dcd3a07d2496"} Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.782439 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.832230 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-combined-ca-bundle\") pod \"88ac5388-d9ec-408e-89d8-49c6c098a33b\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.832290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ceph\") pod \"88ac5388-d9ec-408e-89d8-49c6c098a33b\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.832341 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27whx\" (UniqueName: \"kubernetes.io/projected/88ac5388-d9ec-408e-89d8-49c6c098a33b-kube-api-access-27whx\") pod \"88ac5388-d9ec-408e-89d8-49c6c098a33b\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.832364 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-secret-0\") pod \"88ac5388-d9ec-408e-89d8-49c6c098a33b\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.832440 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ssh-key-openstack-cell1\") pod \"88ac5388-d9ec-408e-89d8-49c6c098a33b\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.832486 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-inventory\") pod \"88ac5388-d9ec-408e-89d8-49c6c098a33b\" (UID: \"88ac5388-d9ec-408e-89d8-49c6c098a33b\") " Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.838876 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "88ac5388-d9ec-408e-89d8-49c6c098a33b" (UID: "88ac5388-d9ec-408e-89d8-49c6c098a33b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.854627 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ceph" (OuterVolumeSpecName: "ceph") pod "88ac5388-d9ec-408e-89d8-49c6c098a33b" (UID: "88ac5388-d9ec-408e-89d8-49c6c098a33b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.869168 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ac5388-d9ec-408e-89d8-49c6c098a33b-kube-api-access-27whx" (OuterVolumeSpecName: "kube-api-access-27whx") pod "88ac5388-d9ec-408e-89d8-49c6c098a33b" (UID: "88ac5388-d9ec-408e-89d8-49c6c098a33b"). InnerVolumeSpecName "kube-api-access-27whx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.873289 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "88ac5388-d9ec-408e-89d8-49c6c098a33b" (UID: "88ac5388-d9ec-408e-89d8-49c6c098a33b"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.875261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "88ac5388-d9ec-408e-89d8-49c6c098a33b" (UID: "88ac5388-d9ec-408e-89d8-49c6c098a33b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.878859 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-inventory" (OuterVolumeSpecName: "inventory") pod "88ac5388-d9ec-408e-89d8-49c6c098a33b" (UID: "88ac5388-d9ec-408e-89d8-49c6c098a33b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.939411 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.939446 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.939456 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.939464 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27whx\" (UniqueName: \"kubernetes.io/projected/88ac5388-d9ec-408e-89d8-49c6c098a33b-kube-api-access-27whx\") on node \"crc\" 
DevicePath \"\"" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.939473 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:42 crc kubenswrapper[4921]: I0318 14:30:42.939482 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/88ac5388-d9ec-408e-89d8-49c6c098a33b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:30:43 crc kubenswrapper[4921]: I0318 14:30:43.284455 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" event={"ID":"88ac5388-d9ec-408e-89d8-49c6c098a33b","Type":"ContainerDied","Data":"7cb9e16753a09ac127257c9e0f39cf57b4cd8d9968471971640bbd02934df98c"} Mar 18 14:30:43 crc kubenswrapper[4921]: I0318 14:30:43.284832 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb9e16753a09ac127257c9e0f39cf57b4cd8d9968471971640bbd02934df98c" Mar 18 14:30:43 crc kubenswrapper[4921]: I0318 14:30:43.284532 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gm8q9" Mar 18 14:30:46 crc kubenswrapper[4921]: I0318 14:30:46.209347 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:30:46 crc kubenswrapper[4921]: E0318 14:30:46.210049 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:30:59 crc kubenswrapper[4921]: I0318 14:30:59.209174 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:30:59 crc kubenswrapper[4921]: E0318 14:30:59.210235 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:31:10 crc kubenswrapper[4921]: I0318 14:31:10.209657 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:31:10 crc kubenswrapper[4921]: E0318 14:31:10.210571 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.048294 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9vfff"] Mar 18 14:31:20 crc kubenswrapper[4921]: E0318 14:31:20.049752 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9dcd334-bc90-417e-81fb-39563ba30e42" containerName="oc" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.049776 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dcd334-bc90-417e-81fb-39563ba30e42" containerName="oc" Mar 18 14:31:20 crc kubenswrapper[4921]: E0318 14:31:20.049858 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9135fd81-c79e-455b-a614-593664b770f3" containerName="collect-profiles" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.049871 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9135fd81-c79e-455b-a614-593664b770f3" containerName="collect-profiles" Mar 18 14:31:20 crc kubenswrapper[4921]: E0318 14:31:20.049898 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ac5388-d9ec-408e-89d8-49c6c098a33b" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.049911 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ac5388-d9ec-408e-89d8-49c6c098a33b" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.050297 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ac5388-d9ec-408e-89d8-49c6c098a33b" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.050338 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9dcd334-bc90-417e-81fb-39563ba30e42" containerName="oc" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.050363 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9135fd81-c79e-455b-a614-593664b770f3" containerName="collect-profiles" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.051795 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.054587 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.055305 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.055310 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.057781 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.057841 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-cc8lt" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.071495 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9vfff"] Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.118910 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-inventory\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.118992 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4ph\" (UniqueName: 
\"kubernetes.io/projected/b747d245-da1c-4df2-b74d-d8f9f226b415-kube-api-access-qd4ph\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.119080 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.119323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ceph\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.119402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.119449 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc 
kubenswrapper[4921]: I0318 14:31:20.221155 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ceph\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.221218 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.221244 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.221319 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-inventory\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.221375 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4ph\" (UniqueName: \"kubernetes.io/projected/b747d245-da1c-4df2-b74d-d8f9f226b415-kube-api-access-qd4ph\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " 
pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.221420 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.228495 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ceph\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.229631 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.233875 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.236657 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-inventory\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: 
\"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.238724 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.263858 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4ph\" (UniqueName: \"kubernetes.io/projected/b747d245-da1c-4df2-b74d-d8f9f226b415-kube-api-access-qd4ph\") pod \"libvirt-openstack-openstack-cell1-9vfff\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:20 crc kubenswrapper[4921]: I0318 14:31:20.392598 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:31:21 crc kubenswrapper[4921]: I0318 14:31:21.012642 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9vfff"] Mar 18 14:31:21 crc kubenswrapper[4921]: I0318 14:31:21.775836 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" event={"ID":"b747d245-da1c-4df2-b74d-d8f9f226b415","Type":"ContainerStarted","Data":"53f6849ff6338fc1eb69f855db2fb3fe08ad32878ede673dd7b678a1b2706fe7"} Mar 18 14:31:21 crc kubenswrapper[4921]: I0318 14:31:21.776327 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" event={"ID":"b747d245-da1c-4df2-b74d-d8f9f226b415","Type":"ContainerStarted","Data":"b83e1ecca75ef8f6a0b1c1700e56812a4fda1cdcc6bb79158203897f097c0f70"} Mar 18 14:31:21 crc kubenswrapper[4921]: I0318 14:31:21.800953 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" podStartSLOduration=1.630348092 podStartE2EDuration="1.800936973s" podCreationTimestamp="2026-03-18 14:31:20 +0000 UTC" firstStartedPulling="2026-03-18 14:31:21.01876303 +0000 UTC m=+8500.568683669" lastFinishedPulling="2026-03-18 14:31:21.189351911 +0000 UTC m=+8500.739272550" observedRunningTime="2026-03-18 14:31:21.790888422 +0000 UTC m=+8501.340809101" watchObservedRunningTime="2026-03-18 14:31:21.800936973 +0000 UTC m=+8501.350857612" Mar 18 14:31:24 crc kubenswrapper[4921]: I0318 14:31:24.212473 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:31:24 crc kubenswrapper[4921]: E0318 14:31:24.214579 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:31:36 crc kubenswrapper[4921]: I0318 14:31:36.209095 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:31:36 crc kubenswrapper[4921]: E0318 14:31:36.209889 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:31:49 crc kubenswrapper[4921]: I0318 14:31:49.210088 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:31:49 crc kubenswrapper[4921]: E0318 14:31:49.210734 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.168438 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564072-x76mp"] Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.171665 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.174573 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.174669 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.174855 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.185524 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-x76mp"] Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.321697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxpk\" (UniqueName: \"kubernetes.io/projected/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7-kube-api-access-mqxpk\") pod \"auto-csr-approver-29564072-x76mp\" (UID: \"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7\") " pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.424096 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqxpk\" (UniqueName: \"kubernetes.io/projected/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7-kube-api-access-mqxpk\") pod \"auto-csr-approver-29564072-x76mp\" (UID: \"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7\") " pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.445316 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqxpk\" (UniqueName: \"kubernetes.io/projected/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7-kube-api-access-mqxpk\") pod \"auto-csr-approver-29564072-x76mp\" (UID: \"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7\") " 
pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:00 crc kubenswrapper[4921]: I0318 14:32:00.498020 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:01 crc kubenswrapper[4921]: I0318 14:32:01.019432 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-x76mp"] Mar 18 14:32:01 crc kubenswrapper[4921]: W0318 14:32:01.026308 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aa75a30_2adb_4d3a_a7d5_59e4fcdcb0c7.slice/crio-8b963ca04bc5b2c9ae813351e58acc56223474609e7d80fcf4605d96cf23ed5b WatchSource:0}: Error finding container 8b963ca04bc5b2c9ae813351e58acc56223474609e7d80fcf4605d96cf23ed5b: Status 404 returned error can't find the container with id 8b963ca04bc5b2c9ae813351e58acc56223474609e7d80fcf4605d96cf23ed5b Mar 18 14:32:01 crc kubenswrapper[4921]: I0318 14:32:01.192539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-x76mp" event={"ID":"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7","Type":"ContainerStarted","Data":"8b963ca04bc5b2c9ae813351e58acc56223474609e7d80fcf4605d96cf23ed5b"} Mar 18 14:32:03 crc kubenswrapper[4921]: I0318 14:32:03.209422 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:32:03 crc kubenswrapper[4921]: E0318 14:32:03.210363 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:32:03 crc 
kubenswrapper[4921]: I0318 14:32:03.220537 4921 generic.go:334] "Generic (PLEG): container finished" podID="3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7" containerID="8dea73a557ed3a534eabcb0f22e513f4166b1d022145746298b79a17899207d6" exitCode=0 Mar 18 14:32:03 crc kubenswrapper[4921]: I0318 14:32:03.226165 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-x76mp" event={"ID":"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7","Type":"ContainerDied","Data":"8dea73a557ed3a534eabcb0f22e513f4166b1d022145746298b79a17899207d6"} Mar 18 14:32:04 crc kubenswrapper[4921]: I0318 14:32:04.635475 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:04 crc kubenswrapper[4921]: I0318 14:32:04.829865 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqxpk\" (UniqueName: \"kubernetes.io/projected/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7-kube-api-access-mqxpk\") pod \"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7\" (UID: \"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7\") " Mar 18 14:32:04 crc kubenswrapper[4921]: I0318 14:32:04.835360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7-kube-api-access-mqxpk" (OuterVolumeSpecName: "kube-api-access-mqxpk") pod "3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7" (UID: "3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7"). InnerVolumeSpecName "kube-api-access-mqxpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:32:04 crc kubenswrapper[4921]: I0318 14:32:04.933006 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqxpk\" (UniqueName: \"kubernetes.io/projected/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7-kube-api-access-mqxpk\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:05 crc kubenswrapper[4921]: I0318 14:32:05.244347 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564072-x76mp" event={"ID":"3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7","Type":"ContainerDied","Data":"8b963ca04bc5b2c9ae813351e58acc56223474609e7d80fcf4605d96cf23ed5b"} Mar 18 14:32:05 crc kubenswrapper[4921]: I0318 14:32:05.244398 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b963ca04bc5b2c9ae813351e58acc56223474609e7d80fcf4605d96cf23ed5b" Mar 18 14:32:05 crc kubenswrapper[4921]: I0318 14:32:05.244400 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564072-x76mp" Mar 18 14:32:05 crc kubenswrapper[4921]: I0318 14:32:05.722756 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-kn9nx"] Mar 18 14:32:05 crc kubenswrapper[4921]: I0318 14:32:05.735730 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564066-kn9nx"] Mar 18 14:32:07 crc kubenswrapper[4921]: I0318 14:32:07.225682 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c3a186-dfc4-4c33-9633-5d120a21f61d" path="/var/lib/kubelet/pods/84c3a186-dfc4-4c33-9633-5d120a21f61d/volumes" Mar 18 14:32:16 crc kubenswrapper[4921]: I0318 14:32:16.211085 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:32:16 crc kubenswrapper[4921]: E0318 14:32:16.212485 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:32:17 crc kubenswrapper[4921]: I0318 14:32:17.551347 4921 scope.go:117] "RemoveContainer" containerID="f54c56ed8a73e1967b29f0b523ff696af5479e56ee3afa510e4fe2e747254faa" Mar 18 14:32:29 crc kubenswrapper[4921]: I0318 14:32:29.209977 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:32:29 crc kubenswrapper[4921]: E0318 14:32:29.211069 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.159098 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6768g"] Mar 18 14:32:41 crc kubenswrapper[4921]: E0318 14:32:41.160235 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7" containerName="oc" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.160320 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7" containerName="oc" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.160583 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7" containerName="oc" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 
14:32:41.162300 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.180441 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6768g"] Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.185629 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-catalog-content\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.185764 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-utilities\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.185930 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgm86\" (UniqueName: \"kubernetes.io/projected/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-kube-api-access-pgm86\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.287568 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-catalog-content\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: 
I0318 14:32:41.287738 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-utilities\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.287778 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgm86\" (UniqueName: \"kubernetes.io/projected/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-kube-api-access-pgm86\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.288792 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-utilities\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.288893 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-catalog-content\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.310551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgm86\" (UniqueName: \"kubernetes.io/projected/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-kube-api-access-pgm86\") pod \"redhat-marketplace-6768g\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.520411 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:41 crc kubenswrapper[4921]: I0318 14:32:41.980559 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6768g"] Mar 18 14:32:42 crc kubenswrapper[4921]: I0318 14:32:42.730259 4921 generic.go:334] "Generic (PLEG): container finished" podID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerID="e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87" exitCode=0 Mar 18 14:32:42 crc kubenswrapper[4921]: I0318 14:32:42.730493 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerDied","Data":"e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87"} Mar 18 14:32:42 crc kubenswrapper[4921]: I0318 14:32:42.730588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerStarted","Data":"939a7c78b32d89504336e0ee911c290a74236232fb1f2fa3ae892ef8031b203c"} Mar 18 14:32:44 crc kubenswrapper[4921]: I0318 14:32:44.210754 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:32:44 crc kubenswrapper[4921]: E0318 14:32:44.212232 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:32:44 crc kubenswrapper[4921]: I0318 14:32:44.761830 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerStarted","Data":"168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207"} Mar 18 14:32:45 crc kubenswrapper[4921]: I0318 14:32:45.780957 4921 generic.go:334] "Generic (PLEG): container finished" podID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerID="168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207" exitCode=0 Mar 18 14:32:45 crc kubenswrapper[4921]: I0318 14:32:45.781011 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerDied","Data":"168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207"} Mar 18 14:32:46 crc kubenswrapper[4921]: I0318 14:32:46.805369 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerStarted","Data":"94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc"} Mar 18 14:32:46 crc kubenswrapper[4921]: I0318 14:32:46.853395 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6768g" podStartSLOduration=2.14717737 podStartE2EDuration="5.853362346s" podCreationTimestamp="2026-03-18 14:32:41 +0000 UTC" firstStartedPulling="2026-03-18 14:32:42.732229312 +0000 UTC m=+8582.282149961" lastFinishedPulling="2026-03-18 14:32:46.438414298 +0000 UTC m=+8585.988334937" observedRunningTime="2026-03-18 14:32:46.829920627 +0000 UTC m=+8586.379841316" watchObservedRunningTime="2026-03-18 14:32:46.853362346 +0000 UTC m=+8586.403283035" Mar 18 14:32:51 crc kubenswrapper[4921]: I0318 14:32:51.520583 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:51 crc kubenswrapper[4921]: I0318 14:32:51.523223 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:51 crc kubenswrapper[4921]: I0318 14:32:51.571660 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:51 crc kubenswrapper[4921]: I0318 14:32:51.925976 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:52 crc kubenswrapper[4921]: I0318 14:32:52.005038 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6768g"] Mar 18 14:32:53 crc kubenswrapper[4921]: I0318 14:32:53.900284 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6768g" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="registry-server" containerID="cri-o://94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc" gracePeriod=2 Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.434473 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.593896 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-utilities\") pod \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.594275 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-catalog-content\") pod \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.594603 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgm86\" (UniqueName: \"kubernetes.io/projected/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-kube-api-access-pgm86\") pod \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\" (UID: \"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea\") " Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.596318 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-utilities" (OuterVolumeSpecName: "utilities") pod "5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" (UID: "5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.604142 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-kube-api-access-pgm86" (OuterVolumeSpecName: "kube-api-access-pgm86") pod "5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" (UID: "5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea"). InnerVolumeSpecName "kube-api-access-pgm86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.619337 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" (UID: "5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.697281 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.697320 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgm86\" (UniqueName: \"kubernetes.io/projected/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-kube-api-access-pgm86\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.697336 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.914424 4921 generic.go:334] "Generic (PLEG): container finished" podID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerID="94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc" exitCode=0 Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.914505 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6768g" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.914524 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerDied","Data":"94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc"} Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.914900 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6768g" event={"ID":"5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea","Type":"ContainerDied","Data":"939a7c78b32d89504336e0ee911c290a74236232fb1f2fa3ae892ef8031b203c"} Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.914925 4921 scope.go:117] "RemoveContainer" containerID="94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.962597 4921 scope.go:117] "RemoveContainer" containerID="168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207" Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.968633 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6768g"] Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.981917 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6768g"] Mar 18 14:32:54 crc kubenswrapper[4921]: I0318 14:32:54.989174 4921 scope.go:117] "RemoveContainer" containerID="e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.045548 4921 scope.go:117] "RemoveContainer" containerID="94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc" Mar 18 14:32:55 crc kubenswrapper[4921]: E0318 14:32:55.045990 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc\": container with ID starting with 94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc not found: ID does not exist" containerID="94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.046021 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc"} err="failed to get container status \"94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc\": rpc error: code = NotFound desc = could not find container \"94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc\": container with ID starting with 94dac4d2316c93b17c0201e32cf865f7cfa4f823330d6aa14b51cdc0006f9cfc not found: ID does not exist" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.046043 4921 scope.go:117] "RemoveContainer" containerID="168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207" Mar 18 14:32:55 crc kubenswrapper[4921]: E0318 14:32:55.046374 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207\": container with ID starting with 168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207 not found: ID does not exist" containerID="168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.046398 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207"} err="failed to get container status \"168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207\": rpc error: code = NotFound desc = could not find container \"168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207\": container with ID 
starting with 168193b5184aed100c79e39cd109771d0b6163d7072b01013ebb2e75548e2207 not found: ID does not exist" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.046409 4921 scope.go:117] "RemoveContainer" containerID="e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87" Mar 18 14:32:55 crc kubenswrapper[4921]: E0318 14:32:55.046751 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87\": container with ID starting with e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87 not found: ID does not exist" containerID="e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.046776 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87"} err="failed to get container status \"e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87\": rpc error: code = NotFound desc = could not find container \"e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87\": container with ID starting with e0dba1831e71e527ea8cd686b6c855ccf6748f1fe4a78528e767e1524bfe3f87 not found: ID does not exist" Mar 18 14:32:55 crc kubenswrapper[4921]: I0318 14:32:55.226060 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" path="/var/lib/kubelet/pods/5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea/volumes" Mar 18 14:32:56 crc kubenswrapper[4921]: I0318 14:32:56.209491 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:32:56 crc kubenswrapper[4921]: E0318 14:32:56.210092 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:33:10 crc kubenswrapper[4921]: I0318 14:33:10.209562 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:33:10 crc kubenswrapper[4921]: E0318 14:33:10.211009 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:33:16 crc kubenswrapper[4921]: I0318 14:33:16.206495 4921 generic.go:334] "Generic (PLEG): container finished" podID="b747d245-da1c-4df2-b74d-d8f9f226b415" containerID="53f6849ff6338fc1eb69f855db2fb3fe08ad32878ede673dd7b678a1b2706fe7" exitCode=2 Mar 18 14:33:16 crc kubenswrapper[4921]: I0318 14:33:16.206573 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" event={"ID":"b747d245-da1c-4df2-b74d-d8f9f226b415","Type":"ContainerDied","Data":"53f6849ff6338fc1eb69f855db2fb3fe08ad32878ede673dd7b678a1b2706fe7"} Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.740686 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.848299 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ceph\") pod \"b747d245-da1c-4df2-b74d-d8f9f226b415\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.848815 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-inventory\") pod \"b747d245-da1c-4df2-b74d-d8f9f226b415\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.848869 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-secret-0\") pod \"b747d245-da1c-4df2-b74d-d8f9f226b415\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.849133 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd4ph\" (UniqueName: \"kubernetes.io/projected/b747d245-da1c-4df2-b74d-d8f9f226b415-kube-api-access-qd4ph\") pod \"b747d245-da1c-4df2-b74d-d8f9f226b415\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.849158 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-combined-ca-bundle\") pod \"b747d245-da1c-4df2-b74d-d8f9f226b415\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.849330 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ssh-key-openstack-cell1\") pod \"b747d245-da1c-4df2-b74d-d8f9f226b415\" (UID: \"b747d245-da1c-4df2-b74d-d8f9f226b415\") " Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.855479 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ceph" (OuterVolumeSpecName: "ceph") pod "b747d245-da1c-4df2-b74d-d8f9f226b415" (UID: "b747d245-da1c-4df2-b74d-d8f9f226b415"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.858395 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b747d245-da1c-4df2-b74d-d8f9f226b415" (UID: "b747d245-da1c-4df2-b74d-d8f9f226b415"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.860001 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b747d245-da1c-4df2-b74d-d8f9f226b415-kube-api-access-qd4ph" (OuterVolumeSpecName: "kube-api-access-qd4ph") pod "b747d245-da1c-4df2-b74d-d8f9f226b415" (UID: "b747d245-da1c-4df2-b74d-d8f9f226b415"). InnerVolumeSpecName "kube-api-access-qd4ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.880466 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-inventory" (OuterVolumeSpecName: "inventory") pod "b747d245-da1c-4df2-b74d-d8f9f226b415" (UID: "b747d245-da1c-4df2-b74d-d8f9f226b415"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.881741 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b747d245-da1c-4df2-b74d-d8f9f226b415" (UID: "b747d245-da1c-4df2-b74d-d8f9f226b415"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.889662 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b747d245-da1c-4df2-b74d-d8f9f226b415" (UID: "b747d245-da1c-4df2-b74d-d8f9f226b415"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.952450 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.952496 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.952510 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd4ph\" (UniqueName: \"kubernetes.io/projected/b747d245-da1c-4df2-b74d-d8f9f226b415-kube-api-access-qd4ph\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.952522 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.952534 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:17 crc kubenswrapper[4921]: I0318 14:33:17.952544 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b747d245-da1c-4df2-b74d-d8f9f226b415-ceph\") on node \"crc\" DevicePath \"\"" Mar 18 14:33:18 crc kubenswrapper[4921]: I0318 14:33:18.232833 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" event={"ID":"b747d245-da1c-4df2-b74d-d8f9f226b415","Type":"ContainerDied","Data":"b83e1ecca75ef8f6a0b1c1700e56812a4fda1cdcc6bb79158203897f097c0f70"} Mar 18 14:33:18 crc kubenswrapper[4921]: I0318 14:33:18.232880 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b83e1ecca75ef8f6a0b1c1700e56812a4fda1cdcc6bb79158203897f097c0f70" Mar 18 14:33:18 crc kubenswrapper[4921]: I0318 14:33:18.232894 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9vfff" Mar 18 14:33:23 crc kubenswrapper[4921]: I0318 14:33:23.210468 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:33:23 crc kubenswrapper[4921]: E0318 14:33:23.211560 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:33:36 crc kubenswrapper[4921]: I0318 14:33:36.209040 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:33:36 crc kubenswrapper[4921]: E0318 14:33:36.210013 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:33:51 crc kubenswrapper[4921]: I0318 14:33:51.222040 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:33:51 crc kubenswrapper[4921]: E0318 14:33:51.223180 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.166558 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564074-k8psk"] Mar 18 14:34:00 crc kubenswrapper[4921]: E0318 14:34:00.168148 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="extract-content" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.168168 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="extract-content" Mar 18 14:34:00 crc kubenswrapper[4921]: E0318 14:34:00.168212 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b747d245-da1c-4df2-b74d-d8f9f226b415" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.168221 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b747d245-da1c-4df2-b74d-d8f9f226b415" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:34:00 crc kubenswrapper[4921]: E0318 14:34:00.168254 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="registry-server" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.168263 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="registry-server" Mar 18 14:34:00 crc kubenswrapper[4921]: E0318 14:34:00.168296 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="extract-utilities" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.168305 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="extract-utilities" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.168610 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc1c1c8-f71c-45ab-b55f-1abc9cf5d4ea" containerName="registry-server" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.168643 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b747d245-da1c-4df2-b74d-d8f9f226b415" containerName="libvirt-openstack-openstack-cell1" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.169882 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.175208 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.175203 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.175399 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.179609 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-k8psk"] Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.217097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwdxm\" (UniqueName: \"kubernetes.io/projected/2468cd73-3363-489d-b4b0-953943ba14e2-kube-api-access-gwdxm\") pod \"auto-csr-approver-29564074-k8psk\" (UID: \"2468cd73-3363-489d-b4b0-953943ba14e2\") " pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.319278 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwdxm\" (UniqueName: \"kubernetes.io/projected/2468cd73-3363-489d-b4b0-953943ba14e2-kube-api-access-gwdxm\") pod 
\"auto-csr-approver-29564074-k8psk\" (UID: \"2468cd73-3363-489d-b4b0-953943ba14e2\") " pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.358999 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwdxm\" (UniqueName: \"kubernetes.io/projected/2468cd73-3363-489d-b4b0-953943ba14e2-kube-api-access-gwdxm\") pod \"auto-csr-approver-29564074-k8psk\" (UID: \"2468cd73-3363-489d-b4b0-953943ba14e2\") " pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.502273 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:00 crc kubenswrapper[4921]: I0318 14:34:00.984520 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-k8psk"] Mar 18 14:34:01 crc kubenswrapper[4921]: I0318 14:34:01.822429 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-k8psk" event={"ID":"2468cd73-3363-489d-b4b0-953943ba14e2","Type":"ContainerStarted","Data":"01a6d097fa8bc4a0cc8336d4f549850beb659e88440b8cc6613e9e9a38dc3e96"} Mar 18 14:34:06 crc kubenswrapper[4921]: I0318 14:34:06.209820 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:34:06 crc kubenswrapper[4921]: E0318 14:34:06.210678 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:34:06 crc kubenswrapper[4921]: I0318 14:34:06.887278 4921 generic.go:334] 
"Generic (PLEG): container finished" podID="2468cd73-3363-489d-b4b0-953943ba14e2" containerID="2ae58cfea3b3a2759984450435b61b6a622b5aad35dcae10dafde96682ec845e" exitCode=0 Mar 18 14:34:06 crc kubenswrapper[4921]: I0318 14:34:06.887382 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-k8psk" event={"ID":"2468cd73-3363-489d-b4b0-953943ba14e2","Type":"ContainerDied","Data":"2ae58cfea3b3a2759984450435b61b6a622b5aad35dcae10dafde96682ec845e"} Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.344931 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.502775 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwdxm\" (UniqueName: \"kubernetes.io/projected/2468cd73-3363-489d-b4b0-953943ba14e2-kube-api-access-gwdxm\") pod \"2468cd73-3363-489d-b4b0-953943ba14e2\" (UID: \"2468cd73-3363-489d-b4b0-953943ba14e2\") " Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.508351 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2468cd73-3363-489d-b4b0-953943ba14e2-kube-api-access-gwdxm" (OuterVolumeSpecName: "kube-api-access-gwdxm") pod "2468cd73-3363-489d-b4b0-953943ba14e2" (UID: "2468cd73-3363-489d-b4b0-953943ba14e2"). InnerVolumeSpecName "kube-api-access-gwdxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.605202 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwdxm\" (UniqueName: \"kubernetes.io/projected/2468cd73-3363-489d-b4b0-953943ba14e2-kube-api-access-gwdxm\") on node \"crc\" DevicePath \"\"" Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.914746 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564074-k8psk" event={"ID":"2468cd73-3363-489d-b4b0-953943ba14e2","Type":"ContainerDied","Data":"01a6d097fa8bc4a0cc8336d4f549850beb659e88440b8cc6613e9e9a38dc3e96"} Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.914789 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a6d097fa8bc4a0cc8336d4f549850beb659e88440b8cc6613e9e9a38dc3e96" Mar 18 14:34:08 crc kubenswrapper[4921]: I0318 14:34:08.914853 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564074-k8psk" Mar 18 14:34:09 crc kubenswrapper[4921]: I0318 14:34:09.446272 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-kcjv6"] Mar 18 14:34:09 crc kubenswrapper[4921]: I0318 14:34:09.462426 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564068-kcjv6"] Mar 18 14:34:11 crc kubenswrapper[4921]: I0318 14:34:11.233303 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a19f478-e8f6-4959-916f-722781e821e4" path="/var/lib/kubelet/pods/9a19f478-e8f6-4959-916f-722781e821e4/volumes" Mar 18 14:34:17 crc kubenswrapper[4921]: I0318 14:34:17.680944 4921 scope.go:117] "RemoveContainer" containerID="becab43cd91748e0f45434974fe6c5abef0004134dce8b0f172c23892aa1cde8" Mar 18 14:34:18 crc kubenswrapper[4921]: I0318 14:34:18.211521 4921 scope.go:117] "RemoveContainer" 
containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:34:18 crc kubenswrapper[4921]: E0318 14:34:18.212044 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.828174 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7752b/must-gather-w5rnd"] Mar 18 14:34:19 crc kubenswrapper[4921]: E0318 14:34:19.829131 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2468cd73-3363-489d-b4b0-953943ba14e2" containerName="oc" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.829144 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2468cd73-3363-489d-b4b0-953943ba14e2" containerName="oc" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.829373 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2468cd73-3363-489d-b4b0-953943ba14e2" containerName="oc" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.831179 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.833142 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7752b"/"kube-root-ca.crt" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.833410 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7752b"/"default-dockercfg-58zfm" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.833580 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7752b"/"openshift-service-ca.crt" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.840695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7752b/must-gather-w5rnd"] Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.877837 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b130adda-ecdd-4197-91c8-39c7817474a8-must-gather-output\") pod \"must-gather-w5rnd\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") " pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.878183 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49mh\" (UniqueName: \"kubernetes.io/projected/b130adda-ecdd-4197-91c8-39c7817474a8-kube-api-access-g49mh\") pod \"must-gather-w5rnd\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") " pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.978999 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b130adda-ecdd-4197-91c8-39c7817474a8-must-gather-output\") pod \"must-gather-w5rnd\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") " 
pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.979204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49mh\" (UniqueName: \"kubernetes.io/projected/b130adda-ecdd-4197-91c8-39c7817474a8-kube-api-access-g49mh\") pod \"must-gather-w5rnd\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") " pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.979599 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b130adda-ecdd-4197-91c8-39c7817474a8-must-gather-output\") pod \"must-gather-w5rnd\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") " pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:19 crc kubenswrapper[4921]: I0318 14:34:19.995735 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49mh\" (UniqueName: \"kubernetes.io/projected/b130adda-ecdd-4197-91c8-39c7817474a8-kube-api-access-g49mh\") pod \"must-gather-w5rnd\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") " pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:20 crc kubenswrapper[4921]: I0318 14:34:20.186509 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/must-gather-w5rnd" Mar 18 14:34:20 crc kubenswrapper[4921]: I0318 14:34:20.666383 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7752b/must-gather-w5rnd"] Mar 18 14:34:20 crc kubenswrapper[4921]: I0318 14:34:20.679401 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:34:21 crc kubenswrapper[4921]: I0318 14:34:21.046407 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/must-gather-w5rnd" event={"ID":"b130adda-ecdd-4197-91c8-39c7817474a8","Type":"ContainerStarted","Data":"9ed4f344fb5f182a1589f38e9944499adb20b5104a39791a952dcdcf9c78437c"} Mar 18 14:34:29 crc kubenswrapper[4921]: I0318 14:34:29.130935 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/must-gather-w5rnd" event={"ID":"b130adda-ecdd-4197-91c8-39c7817474a8","Type":"ContainerStarted","Data":"aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29"} Mar 18 14:34:29 crc kubenswrapper[4921]: I0318 14:34:29.131522 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/must-gather-w5rnd" event={"ID":"b130adda-ecdd-4197-91c8-39c7817474a8","Type":"ContainerStarted","Data":"43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176"} Mar 18 14:34:29 crc kubenswrapper[4921]: I0318 14:34:29.159106 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7752b/must-gather-w5rnd" podStartSLOduration=2.689647332 podStartE2EDuration="10.159082187s" podCreationTimestamp="2026-03-18 14:34:19 +0000 UTC" firstStartedPulling="2026-03-18 14:34:20.679191078 +0000 UTC m=+8680.229111717" lastFinishedPulling="2026-03-18 14:34:28.148625933 +0000 UTC m=+8687.698546572" observedRunningTime="2026-03-18 14:34:29.151813497 +0000 UTC m=+8688.701734176" watchObservedRunningTime="2026-03-18 14:34:29.159082187 +0000 UTC 
m=+8688.709002836" Mar 18 14:34:30 crc kubenswrapper[4921]: I0318 14:34:30.209389 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:34:30 crc kubenswrapper[4921]: E0318 14:34:30.209944 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:34:33 crc kubenswrapper[4921]: E0318 14:34:33.172619 4921 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.200:47562->38.129.56.200:45633: write tcp 38.129.56.200:47562->38.129.56.200:45633: write: broken pipe Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.578643 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7752b/crc-debug-q5mg6"] Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.580433 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.715820 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58f6l\" (UniqueName: \"kubernetes.io/projected/ec5cf438-45c7-496c-b5f0-e8845e5ef758-kube-api-access-58f6l\") pod \"crc-debug-q5mg6\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.716258 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec5cf438-45c7-496c-b5f0-e8845e5ef758-host\") pod \"crc-debug-q5mg6\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.819104 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58f6l\" (UniqueName: \"kubernetes.io/projected/ec5cf438-45c7-496c-b5f0-e8845e5ef758-kube-api-access-58f6l\") pod \"crc-debug-q5mg6\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.819194 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec5cf438-45c7-496c-b5f0-e8845e5ef758-host\") pod \"crc-debug-q5mg6\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.819249 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec5cf438-45c7-496c-b5f0-e8845e5ef758-host\") pod \"crc-debug-q5mg6\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc 
kubenswrapper[4921]: I0318 14:34:34.843074 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58f6l\" (UniqueName: \"kubernetes.io/projected/ec5cf438-45c7-496c-b5f0-e8845e5ef758-kube-api-access-58f6l\") pod \"crc-debug-q5mg6\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: I0318 14:34:34.906135 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:34:34 crc kubenswrapper[4921]: W0318 14:34:34.936716 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec5cf438_45c7_496c_b5f0_e8845e5ef758.slice/crio-5087f4839cb17312853466b733a07fc71abcef4e482ba6e1859a20be30cf0ba6 WatchSource:0}: Error finding container 5087f4839cb17312853466b733a07fc71abcef4e482ba6e1859a20be30cf0ba6: Status 404 returned error can't find the container with id 5087f4839cb17312853466b733a07fc71abcef4e482ba6e1859a20be30cf0ba6 Mar 18 14:34:35 crc kubenswrapper[4921]: I0318 14:34:35.198510 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-q5mg6" event={"ID":"ec5cf438-45c7-496c-b5f0-e8845e5ef758","Type":"ContainerStarted","Data":"5087f4839cb17312853466b733a07fc71abcef4e482ba6e1859a20be30cf0ba6"} Mar 18 14:34:36 crc kubenswrapper[4921]: E0318 14:34:36.023642 4921 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.200:47668->38.129.56.200:45633: write tcp 38.129.56.200:47668->38.129.56.200:45633: write: broken pipe Mar 18 14:34:42 crc kubenswrapper[4921]: I0318 14:34:42.209855 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:34:42 crc kubenswrapper[4921]: E0318 14:34:42.210976 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:34:49 crc kubenswrapper[4921]: I0318 14:34:49.403561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-q5mg6" event={"ID":"ec5cf438-45c7-496c-b5f0-e8845e5ef758","Type":"ContainerStarted","Data":"310848713c98dad2c605ceb68f75e6a3cd47434bd384d49e2405858cc9720b45"} Mar 18 14:34:49 crc kubenswrapper[4921]: I0318 14:34:49.421082 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7752b/crc-debug-q5mg6" podStartSLOduration=1.752373377 podStartE2EDuration="15.421064351s" podCreationTimestamp="2026-03-18 14:34:34 +0000 UTC" firstStartedPulling="2026-03-18 14:34:34.939632109 +0000 UTC m=+8694.489552738" lastFinishedPulling="2026-03-18 14:34:48.608323073 +0000 UTC m=+8708.158243712" observedRunningTime="2026-03-18 14:34:49.418408314 +0000 UTC m=+8708.968328953" watchObservedRunningTime="2026-03-18 14:34:49.421064351 +0000 UTC m=+8708.970984990" Mar 18 14:34:57 crc kubenswrapper[4921]: I0318 14:34:57.209680 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:34:57 crc kubenswrapper[4921]: E0318 14:34:57.210425 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:35:10 
crc kubenswrapper[4921]: I0318 14:35:10.210374 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:35:10 crc kubenswrapper[4921]: E0318 14:35:10.211196 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:35:14 crc kubenswrapper[4921]: I0318 14:35:14.614570 4921 generic.go:334] "Generic (PLEG): container finished" podID="ec5cf438-45c7-496c-b5f0-e8845e5ef758" containerID="310848713c98dad2c605ceb68f75e6a3cd47434bd384d49e2405858cc9720b45" exitCode=0 Mar 18 14:35:14 crc kubenswrapper[4921]: I0318 14:35:14.614693 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-q5mg6" event={"ID":"ec5cf438-45c7-496c-b5f0-e8845e5ef758","Type":"ContainerDied","Data":"310848713c98dad2c605ceb68f75e6a3cd47434bd384d49e2405858cc9720b45"} Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.774794 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.809282 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7752b/crc-debug-q5mg6"] Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.818467 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7752b/crc-debug-q5mg6"] Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.896857 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58f6l\" (UniqueName: \"kubernetes.io/projected/ec5cf438-45c7-496c-b5f0-e8845e5ef758-kube-api-access-58f6l\") pod \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.897080 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec5cf438-45c7-496c-b5f0-e8845e5ef758-host\") pod \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\" (UID: \"ec5cf438-45c7-496c-b5f0-e8845e5ef758\") " Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.897249 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec5cf438-45c7-496c-b5f0-e8845e5ef758-host" (OuterVolumeSpecName: "host") pod "ec5cf438-45c7-496c-b5f0-e8845e5ef758" (UID: "ec5cf438-45c7-496c-b5f0-e8845e5ef758"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.897684 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec5cf438-45c7-496c-b5f0-e8845e5ef758-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:15 crc kubenswrapper[4921]: I0318 14:35:15.903155 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5cf438-45c7-496c-b5f0-e8845e5ef758-kube-api-access-58f6l" (OuterVolumeSpecName: "kube-api-access-58f6l") pod "ec5cf438-45c7-496c-b5f0-e8845e5ef758" (UID: "ec5cf438-45c7-496c-b5f0-e8845e5ef758"). InnerVolumeSpecName "kube-api-access-58f6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:16 crc kubenswrapper[4921]: I0318 14:35:15.999970 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58f6l\" (UniqueName: \"kubernetes.io/projected/ec5cf438-45c7-496c-b5f0-e8845e5ef758-kube-api-access-58f6l\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:16 crc kubenswrapper[4921]: I0318 14:35:16.637674 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5087f4839cb17312853466b733a07fc71abcef4e482ba6e1859a20be30cf0ba6" Mar 18 14:35:16 crc kubenswrapper[4921]: I0318 14:35:16.637720 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q5mg6" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.033075 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7752b/crc-debug-q62cr"] Mar 18 14:35:17 crc kubenswrapper[4921]: E0318 14:35:17.033596 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5cf438-45c7-496c-b5f0-e8845e5ef758" containerName="container-00" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.033614 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5cf438-45c7-496c-b5f0-e8845e5ef758" containerName="container-00" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.033904 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5cf438-45c7-496c-b5f0-e8845e5ef758" containerName="container-00" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.034803 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.129781 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gvc\" (UniqueName: \"kubernetes.io/projected/3a7071eb-9688-4ba8-be80-f6000b85b875-kube-api-access-j4gvc\") pod \"crc-debug-q62cr\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.129957 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a7071eb-9688-4ba8-be80-f6000b85b875-host\") pod \"crc-debug-q62cr\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.227154 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5cf438-45c7-496c-b5f0-e8845e5ef758" 
path="/var/lib/kubelet/pods/ec5cf438-45c7-496c-b5f0-e8845e5ef758/volumes" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.232239 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a7071eb-9688-4ba8-be80-f6000b85b875-host\") pod \"crc-debug-q62cr\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.232454 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gvc\" (UniqueName: \"kubernetes.io/projected/3a7071eb-9688-4ba8-be80-f6000b85b875-kube-api-access-j4gvc\") pod \"crc-debug-q62cr\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.232591 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a7071eb-9688-4ba8-be80-f6000b85b875-host\") pod \"crc-debug-q62cr\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.250980 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gvc\" (UniqueName: \"kubernetes.io/projected/3a7071eb-9688-4ba8-be80-f6000b85b875-kube-api-access-j4gvc\") pod \"crc-debug-q62cr\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.353929 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:17 crc kubenswrapper[4921]: I0318 14:35:17.647502 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-q62cr" event={"ID":"3a7071eb-9688-4ba8-be80-f6000b85b875","Type":"ContainerStarted","Data":"426a7c622fdf24daaf067ff0feb2f74139aec785ba9f39011102cec6c9995bf2"} Mar 18 14:35:18 crc kubenswrapper[4921]: I0318 14:35:18.657750 4921 generic.go:334] "Generic (PLEG): container finished" podID="3a7071eb-9688-4ba8-be80-f6000b85b875" containerID="1ce8aac7f4cef5bcf33675b49276f73bd21c4afb6eb438373ac0dd8c69f062a2" exitCode=0 Mar 18 14:35:18 crc kubenswrapper[4921]: I0318 14:35:18.657854 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-q62cr" event={"ID":"3a7071eb-9688-4ba8-be80-f6000b85b875","Type":"ContainerDied","Data":"1ce8aac7f4cef5bcf33675b49276f73bd21c4afb6eb438373ac0dd8c69f062a2"} Mar 18 14:35:18 crc kubenswrapper[4921]: I0318 14:35:18.800154 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7752b/crc-debug-q62cr"] Mar 18 14:35:18 crc kubenswrapper[4921]: I0318 14:35:18.820259 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7752b/crc-debug-q62cr"] Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.766898 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.885505 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a7071eb-9688-4ba8-be80-f6000b85b875-host\") pod \"3a7071eb-9688-4ba8-be80-f6000b85b875\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.885586 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a7071eb-9688-4ba8-be80-f6000b85b875-host" (OuterVolumeSpecName: "host") pod "3a7071eb-9688-4ba8-be80-f6000b85b875" (UID: "3a7071eb-9688-4ba8-be80-f6000b85b875"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.885636 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4gvc\" (UniqueName: \"kubernetes.io/projected/3a7071eb-9688-4ba8-be80-f6000b85b875-kube-api-access-j4gvc\") pod \"3a7071eb-9688-4ba8-be80-f6000b85b875\" (UID: \"3a7071eb-9688-4ba8-be80-f6000b85b875\") " Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.886198 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3a7071eb-9688-4ba8-be80-f6000b85b875-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.891699 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7071eb-9688-4ba8-be80-f6000b85b875-kube-api-access-j4gvc" (OuterVolumeSpecName: "kube-api-access-j4gvc") pod "3a7071eb-9688-4ba8-be80-f6000b85b875" (UID: "3a7071eb-9688-4ba8-be80-f6000b85b875"). InnerVolumeSpecName "kube-api-access-j4gvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:19 crc kubenswrapper[4921]: I0318 14:35:19.988404 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4gvc\" (UniqueName: \"kubernetes.io/projected/3a7071eb-9688-4ba8-be80-f6000b85b875-kube-api-access-j4gvc\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.048297 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7752b/crc-debug-9646c"] Mar 18 14:35:20 crc kubenswrapper[4921]: E0318 14:35:20.048839 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7071eb-9688-4ba8-be80-f6000b85b875" containerName="container-00" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.048866 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7071eb-9688-4ba8-be80-f6000b85b875" containerName="container-00" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.049190 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a7071eb-9688-4ba8-be80-f6000b85b875" containerName="container-00" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.050134 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.191910 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgzd\" (UniqueName: \"kubernetes.io/projected/3cd9a495-fadf-4774-b29b-e5f028de12b2-kube-api-access-vtgzd\") pod \"crc-debug-9646c\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.191989 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd9a495-fadf-4774-b29b-e5f028de12b2-host\") pod \"crc-debug-9646c\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.294772 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgzd\" (UniqueName: \"kubernetes.io/projected/3cd9a495-fadf-4774-b29b-e5f028de12b2-kube-api-access-vtgzd\") pod \"crc-debug-9646c\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.294877 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd9a495-fadf-4774-b29b-e5f028de12b2-host\") pod \"crc-debug-9646c\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.295391 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd9a495-fadf-4774-b29b-e5f028de12b2-host\") pod \"crc-debug-9646c\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc 
kubenswrapper[4921]: I0318 14:35:20.315533 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgzd\" (UniqueName: \"kubernetes.io/projected/3cd9a495-fadf-4774-b29b-e5f028de12b2-kube-api-access-vtgzd\") pod \"crc-debug-9646c\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.368202 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.676899 4921 generic.go:334] "Generic (PLEG): container finished" podID="3cd9a495-fadf-4774-b29b-e5f028de12b2" containerID="ff2ba95ba3754dac0cdc262f15847d690b08821f64976106b8302b1640c9ee20" exitCode=0 Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.676998 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-9646c" event={"ID":"3cd9a495-fadf-4774-b29b-e5f028de12b2","Type":"ContainerDied","Data":"ff2ba95ba3754dac0cdc262f15847d690b08821f64976106b8302b1640c9ee20"} Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.677459 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/crc-debug-9646c" event={"ID":"3cd9a495-fadf-4774-b29b-e5f028de12b2","Type":"ContainerStarted","Data":"1c5c3fe22dc41ba685cff3b6a8f8fd70b70867d79bb7bc0e7826d6740879cd4a"} Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.682380 4921 scope.go:117] "RemoveContainer" containerID="1ce8aac7f4cef5bcf33675b49276f73bd21c4afb6eb438373ac0dd8c69f062a2" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.682427 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-q62cr" Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.763140 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7752b/crc-debug-9646c"] Mar 18 14:35:20 crc kubenswrapper[4921]: I0318 14:35:20.772453 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7752b/crc-debug-9646c"] Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.235976 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7071eb-9688-4ba8-be80-f6000b85b875" path="/var/lib/kubelet/pods/3a7071eb-9688-4ba8-be80-f6000b85b875/volumes" Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.820974 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.933299 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtgzd\" (UniqueName: \"kubernetes.io/projected/3cd9a495-fadf-4774-b29b-e5f028de12b2-kube-api-access-vtgzd\") pod \"3cd9a495-fadf-4774-b29b-e5f028de12b2\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.933440 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd9a495-fadf-4774-b29b-e5f028de12b2-host\") pod \"3cd9a495-fadf-4774-b29b-e5f028de12b2\" (UID: \"3cd9a495-fadf-4774-b29b-e5f028de12b2\") " Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.933679 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cd9a495-fadf-4774-b29b-e5f028de12b2-host" (OuterVolumeSpecName: "host") pod "3cd9a495-fadf-4774-b29b-e5f028de12b2" (UID: "3cd9a495-fadf-4774-b29b-e5f028de12b2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.934432 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd9a495-fadf-4774-b29b-e5f028de12b2-host\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:21 crc kubenswrapper[4921]: I0318 14:35:21.940011 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd9a495-fadf-4774-b29b-e5f028de12b2-kube-api-access-vtgzd" (OuterVolumeSpecName: "kube-api-access-vtgzd") pod "3cd9a495-fadf-4774-b29b-e5f028de12b2" (UID: "3cd9a495-fadf-4774-b29b-e5f028de12b2"). InnerVolumeSpecName "kube-api-access-vtgzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:35:22 crc kubenswrapper[4921]: I0318 14:35:22.036055 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtgzd\" (UniqueName: \"kubernetes.io/projected/3cd9a495-fadf-4774-b29b-e5f028de12b2-kube-api-access-vtgzd\") on node \"crc\" DevicePath \"\"" Mar 18 14:35:22 crc kubenswrapper[4921]: I0318 14:35:22.209202 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:35:22 crc kubenswrapper[4921]: I0318 14:35:22.705736 4921 scope.go:117] "RemoveContainer" containerID="ff2ba95ba3754dac0cdc262f15847d690b08821f64976106b8302b1640c9ee20" Mar 18 14:35:22 crc kubenswrapper[4921]: I0318 14:35:22.705761 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7752b/crc-debug-9646c" Mar 18 14:35:22 crc kubenswrapper[4921]: I0318 14:35:22.709627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"d7f9507b3191d3ee30fbb7b57c8db74856ae4b7f60e965956118b9fe19bf769b"} Mar 18 14:35:23 crc kubenswrapper[4921]: I0318 14:35:23.221048 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd9a495-fadf-4774-b29b-e5f028de12b2" path="/var/lib/kubelet/pods/3cd9a495-fadf-4774-b29b-e5f028de12b2/volumes" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.152261 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564076-2tz9f"] Mar 18 14:36:00 crc kubenswrapper[4921]: E0318 14:36:00.153222 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd9a495-fadf-4774-b29b-e5f028de12b2" containerName="container-00" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.153236 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd9a495-fadf-4774-b29b-e5f028de12b2" containerName="container-00" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.153422 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd9a495-fadf-4774-b29b-e5f028de12b2" containerName="container-00" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.154130 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.157993 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.158001 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.158041 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.162182 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-2tz9f"] Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.226566 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqx2\" (UniqueName: \"kubernetes.io/projected/7d796a30-949c-46fe-8cac-21ec75723bc7-kube-api-access-9pqx2\") pod \"auto-csr-approver-29564076-2tz9f\" (UID: \"7d796a30-949c-46fe-8cac-21ec75723bc7\") " pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.328672 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqx2\" (UniqueName: \"kubernetes.io/projected/7d796a30-949c-46fe-8cac-21ec75723bc7-kube-api-access-9pqx2\") pod \"auto-csr-approver-29564076-2tz9f\" (UID: \"7d796a30-949c-46fe-8cac-21ec75723bc7\") " pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.352911 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqx2\" (UniqueName: \"kubernetes.io/projected/7d796a30-949c-46fe-8cac-21ec75723bc7-kube-api-access-9pqx2\") pod \"auto-csr-approver-29564076-2tz9f\" (UID: \"7d796a30-949c-46fe-8cac-21ec75723bc7\") " 
pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.478316 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:00 crc kubenswrapper[4921]: I0318 14:36:00.972495 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-2tz9f"] Mar 18 14:36:01 crc kubenswrapper[4921]: I0318 14:36:01.729872 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" event={"ID":"7d796a30-949c-46fe-8cac-21ec75723bc7","Type":"ContainerStarted","Data":"d45501b260bd4ec619ed8c68cc0b7639ec0967c93c6503558e0f8a0e70d2338f"} Mar 18 14:36:02 crc kubenswrapper[4921]: I0318 14:36:02.743320 4921 generic.go:334] "Generic (PLEG): container finished" podID="7d796a30-949c-46fe-8cac-21ec75723bc7" containerID="87ed8adc92edd5c8806fff7a76b905f86aa1db5d4e323a21c4d745796bb7865d" exitCode=0 Mar 18 14:36:02 crc kubenswrapper[4921]: I0318 14:36:02.743434 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" event={"ID":"7d796a30-949c-46fe-8cac-21ec75723bc7","Type":"ContainerDied","Data":"87ed8adc92edd5c8806fff7a76b905f86aa1db5d4e323a21c4d745796bb7865d"} Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.151947 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.216866 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqx2\" (UniqueName: \"kubernetes.io/projected/7d796a30-949c-46fe-8cac-21ec75723bc7-kube-api-access-9pqx2\") pod \"7d796a30-949c-46fe-8cac-21ec75723bc7\" (UID: \"7d796a30-949c-46fe-8cac-21ec75723bc7\") " Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.222633 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d796a30-949c-46fe-8cac-21ec75723bc7-kube-api-access-9pqx2" (OuterVolumeSpecName: "kube-api-access-9pqx2") pod "7d796a30-949c-46fe-8cac-21ec75723bc7" (UID: "7d796a30-949c-46fe-8cac-21ec75723bc7"). InnerVolumeSpecName "kube-api-access-9pqx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.320058 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqx2\" (UniqueName: \"kubernetes.io/projected/7d796a30-949c-46fe-8cac-21ec75723bc7-kube-api-access-9pqx2\") on node \"crc\" DevicePath \"\"" Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.792713 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" event={"ID":"7d796a30-949c-46fe-8cac-21ec75723bc7","Type":"ContainerDied","Data":"d45501b260bd4ec619ed8c68cc0b7639ec0967c93c6503558e0f8a0e70d2338f"} Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.793006 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45501b260bd4ec619ed8c68cc0b7639ec0967c93c6503558e0f8a0e70d2338f" Mar 18 14:36:04 crc kubenswrapper[4921]: I0318 14:36:04.793062 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564076-2tz9f" Mar 18 14:36:05 crc kubenswrapper[4921]: I0318 14:36:05.263396 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-rglgp"] Mar 18 14:36:05 crc kubenswrapper[4921]: I0318 14:36:05.279336 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564070-rglgp"] Mar 18 14:36:07 crc kubenswrapper[4921]: I0318 14:36:07.225279 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9dcd334-bc90-417e-81fb-39563ba30e42" path="/var/lib/kubelet/pods/d9dcd334-bc90-417e-81fb-39563ba30e42/volumes" Mar 18 14:36:17 crc kubenswrapper[4921]: I0318 14:36:17.805704 4921 scope.go:117] "RemoveContainer" containerID="1eca05bc1d55d9f308a2e33c61ca885908597a4f8d63d93a936193576bc0da40" Mar 18 14:37:47 crc kubenswrapper[4921]: I0318 14:37:47.081336 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:37:47 crc kubenswrapper[4921]: I0318 14:37:47.081870 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.151161 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564078-kqmdw"] Mar 18 14:38:00 crc kubenswrapper[4921]: E0318 14:38:00.152296 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d796a30-949c-46fe-8cac-21ec75723bc7" containerName="oc" Mar 18 14:38:00 crc 
kubenswrapper[4921]: I0318 14:38:00.152312 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d796a30-949c-46fe-8cac-21ec75723bc7" containerName="oc" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.152588 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d796a30-949c-46fe-8cac-21ec75723bc7" containerName="oc" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.153466 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.156498 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.159031 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.167362 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.167457 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-kqmdw"] Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.345607 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btbnb\" (UniqueName: \"kubernetes.io/projected/8a400f81-0112-4f11-9718-5720e3e6e2ef-kube-api-access-btbnb\") pod \"auto-csr-approver-29564078-kqmdw\" (UID: \"8a400f81-0112-4f11-9718-5720e3e6e2ef\") " pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.447007 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btbnb\" (UniqueName: \"kubernetes.io/projected/8a400f81-0112-4f11-9718-5720e3e6e2ef-kube-api-access-btbnb\") pod \"auto-csr-approver-29564078-kqmdw\" 
(UID: \"8a400f81-0112-4f11-9718-5720e3e6e2ef\") " pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.467762 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btbnb\" (UniqueName: \"kubernetes.io/projected/8a400f81-0112-4f11-9718-5720e3e6e2ef-kube-api-access-btbnb\") pod \"auto-csr-approver-29564078-kqmdw\" (UID: \"8a400f81-0112-4f11-9718-5720e3e6e2ef\") " pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.476891 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:00 crc kubenswrapper[4921]: I0318 14:38:00.969008 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-kqmdw"] Mar 18 14:38:01 crc kubenswrapper[4921]: I0318 14:38:01.976589 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" event={"ID":"8a400f81-0112-4f11-9718-5720e3e6e2ef","Type":"ContainerStarted","Data":"3dd3e8f1102c2dd96a1a9b7d529db51e26cf0bd2c652e6b89949357c8f05e521"} Mar 18 14:38:02 crc kubenswrapper[4921]: I0318 14:38:02.988807 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" event={"ID":"8a400f81-0112-4f11-9718-5720e3e6e2ef","Type":"ContainerStarted","Data":"e9deb72624c0f7c62095c3b3cd36e3293c7e0bc2439e240c3f2e8356de121c8a"} Mar 18 14:38:03 crc kubenswrapper[4921]: I0318 14:38:03.015522 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" podStartSLOduration=1.629924961 podStartE2EDuration="3.015491708s" podCreationTimestamp="2026-03-18 14:38:00 +0000 UTC" firstStartedPulling="2026-03-18 14:38:00.983005235 +0000 UTC m=+8900.532925874" lastFinishedPulling="2026-03-18 14:38:02.368571972 +0000 UTC 
m=+8901.918492621" observedRunningTime="2026-03-18 14:38:03.005243831 +0000 UTC m=+8902.555164490" watchObservedRunningTime="2026-03-18 14:38:03.015491708 +0000 UTC m=+8902.565412347" Mar 18 14:38:03 crc kubenswrapper[4921]: I0318 14:38:03.999182 4921 generic.go:334] "Generic (PLEG): container finished" podID="8a400f81-0112-4f11-9718-5720e3e6e2ef" containerID="e9deb72624c0f7c62095c3b3cd36e3293c7e0bc2439e240c3f2e8356de121c8a" exitCode=0 Mar 18 14:38:03 crc kubenswrapper[4921]: I0318 14:38:03.999248 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" event={"ID":"8a400f81-0112-4f11-9718-5720e3e6e2ef","Type":"ContainerDied","Data":"e9deb72624c0f7c62095c3b3cd36e3293c7e0bc2439e240c3f2e8356de121c8a"} Mar 18 14:38:05 crc kubenswrapper[4921]: I0318 14:38:05.714636 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:05 crc kubenswrapper[4921]: I0318 14:38:05.865503 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btbnb\" (UniqueName: \"kubernetes.io/projected/8a400f81-0112-4f11-9718-5720e3e6e2ef-kube-api-access-btbnb\") pod \"8a400f81-0112-4f11-9718-5720e3e6e2ef\" (UID: \"8a400f81-0112-4f11-9718-5720e3e6e2ef\") " Mar 18 14:38:05 crc kubenswrapper[4921]: I0318 14:38:05.872542 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a400f81-0112-4f11-9718-5720e3e6e2ef-kube-api-access-btbnb" (OuterVolumeSpecName: "kube-api-access-btbnb") pod "8a400f81-0112-4f11-9718-5720e3e6e2ef" (UID: "8a400f81-0112-4f11-9718-5720e3e6e2ef"). InnerVolumeSpecName "kube-api-access-btbnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:38:05 crc kubenswrapper[4921]: I0318 14:38:05.968247 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btbnb\" (UniqueName: \"kubernetes.io/projected/8a400f81-0112-4f11-9718-5720e3e6e2ef-kube-api-access-btbnb\") on node \"crc\" DevicePath \"\"" Mar 18 14:38:06 crc kubenswrapper[4921]: I0318 14:38:06.070881 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" event={"ID":"8a400f81-0112-4f11-9718-5720e3e6e2ef","Type":"ContainerDied","Data":"3dd3e8f1102c2dd96a1a9b7d529db51e26cf0bd2c652e6b89949357c8f05e521"} Mar 18 14:38:06 crc kubenswrapper[4921]: I0318 14:38:06.070932 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd3e8f1102c2dd96a1a9b7d529db51e26cf0bd2c652e6b89949357c8f05e521" Mar 18 14:38:06 crc kubenswrapper[4921]: I0318 14:38:06.070959 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564078-kqmdw" Mar 18 14:38:06 crc kubenswrapper[4921]: I0318 14:38:06.084948 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-x76mp"] Mar 18 14:38:06 crc kubenswrapper[4921]: I0318 14:38:06.096224 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564072-x76mp"] Mar 18 14:38:07 crc kubenswrapper[4921]: I0318 14:38:07.225635 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7" path="/var/lib/kubelet/pods/3aa75a30-2adb-4d3a-a7d5-59e4fcdcb0c7/volumes" Mar 18 14:38:17 crc kubenswrapper[4921]: I0318 14:38:17.080700 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 14:38:17 crc kubenswrapper[4921]: I0318 14:38:17.081220 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:38:18 crc kubenswrapper[4921]: I0318 14:38:18.038010 4921 scope.go:117] "RemoveContainer" containerID="8dea73a557ed3a534eabcb0f22e513f4166b1d022145746298b79a17899207d6" Mar 18 14:38:18 crc kubenswrapper[4921]: I0318 14:38:18.515337 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0f72ab46-f42c-4ec6-8f39-637fb5de6a9a/init-config-reloader/0.log" Mar 18 14:38:18 crc kubenswrapper[4921]: I0318 14:38:18.744617 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0f72ab46-f42c-4ec6-8f39-637fb5de6a9a/config-reloader/0.log" Mar 18 14:38:18 crc kubenswrapper[4921]: I0318 14:38:18.761687 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0f72ab46-f42c-4ec6-8f39-637fb5de6a9a/alertmanager/0.log" Mar 18 14:38:18 crc kubenswrapper[4921]: I0318 14:38:18.777657 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0f72ab46-f42c-4ec6-8f39-637fb5de6a9a/init-config-reloader/0.log" Mar 18 14:38:18 crc kubenswrapper[4921]: I0318 14:38:18.951672 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7dc34189-0095-41c5-8847-f91ddb972ce0/aodh-api/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.002474 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7dc34189-0095-41c5-8847-f91ddb972ce0/aodh-listener/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.016735 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7dc34189-0095-41c5-8847-f91ddb972ce0/aodh-evaluator/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.101953 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7dc34189-0095-41c5-8847-f91ddb972ce0/aodh-notifier/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.183535 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f8cd65696-hnlnk_443d9a97-2a7c-4067-adae-3d0a8f914283/barbican-api/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.256822 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f8cd65696-hnlnk_443d9a97-2a7c-4067-adae-3d0a8f914283/barbican-api-log/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.412084 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f5794fb5d-9w9n9_e679d5e2-0e15-4b0a-bf61-972499585c0b/barbican-keystone-listener/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.504478 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f5794fb5d-9w9n9_e679d5e2-0e15-4b0a-bf61-972499585c0b/barbican-keystone-listener-log/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.567843 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-567d546d79-rgpwb_99927959-518a-4cbe-8e6c-36e060549d91/barbican-worker/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.624237 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-567d546d79-rgpwb_99927959-518a-4cbe-8e6c-36e060549d91/barbican-worker-log/0.log" Mar 18 14:38:19 crc kubenswrapper[4921]: I0318 14:38:19.953015 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c29011d7-f9b6-4594-9817-af2780632e82/ceilometer-central-agent/0.log" Mar 18 14:38:20 crc 
kubenswrapper[4921]: I0318 14:38:20.013054 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-lk6pp_ed8f327e-6fda-4c61-b4fa-5aa10d9f7f17/bootstrap-openstack-openstack-cell1/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.131354 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c29011d7-f9b6-4594-9817-af2780632e82/ceilometer-notification-agent/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.140673 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c29011d7-f9b6-4594-9817-af2780632e82/proxy-httpd/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.209159 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c29011d7-f9b6-4594-9817-af2780632e82/sg-core/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.337397 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-cq5hz_b062ff1e-3378-47fb-bd4e-95e3ea289ca8/ceph-client-openstack-openstack-cell1/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.487430 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0de51569-4dc0-4e65-a286-461d82659895/cinder-api/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.612041 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0de51569-4dc0-4e65-a286-461d82659895/cinder-api-log/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.821267 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23/probe/0.log" Mar 18 14:38:20 crc kubenswrapper[4921]: I0318 14:38:20.824192 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_47dbc0b8-741d-4d4e-ab32-0bf3ab3cca23/cinder-backup/0.log" Mar 18 14:38:20 crc 
kubenswrapper[4921]: I0318 14:38:20.997463 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c240b2e2-9404-4c37-a827-5183cc419888/cinder-scheduler/0.log" Mar 18 14:38:21 crc kubenswrapper[4921]: I0318 14:38:21.261944 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c240b2e2-9404-4c37-a827-5183cc419888/probe/0.log" Mar 18 14:38:21 crc kubenswrapper[4921]: I0318 14:38:21.626623 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_794e18b3-f1c7-4e09-a80c-7a4b46bd1636/probe/0.log" Mar 18 14:38:21 crc kubenswrapper[4921]: I0318 14:38:21.655826 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_794e18b3-f1c7-4e09-a80c-7a4b46bd1636/cinder-volume/0.log" Mar 18 14:38:21 crc kubenswrapper[4921]: I0318 14:38:21.662136 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-wdz5v_8591ea69-221d-49f3-be00-607522d37c6e/configure-network-openstack-openstack-cell1/0.log" Mar 18 14:38:21 crc kubenswrapper[4921]: I0318 14:38:21.894082 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-df8f9c6bc-rn6dv_ace2674f-ecdd-4b84-850e-f7f40bb91fbc/init/0.log" Mar 18 14:38:21 crc kubenswrapper[4921]: I0318 14:38:21.964423 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-9gzc2_501d322f-bf37-45bc-8604-0099a6408aac/configure-os-openstack-openstack-cell1/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.087195 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-df8f9c6bc-rn6dv_ace2674f-ecdd-4b84-850e-f7f40bb91fbc/init/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.096705 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-df8f9c6bc-rn6dv_ace2674f-ecdd-4b84-850e-f7f40bb91fbc/dnsmasq-dns/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.249648 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-mpg2w_57495fff-4b60-4b94-91ba-f9c34a6afd6c/download-cache-openstack-openstack-cell1/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.332814 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0b42ed4c-386d-4808-924a-2595f4e8c98a/glance-log/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.335777 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0b42ed4c-386d-4808-924a-2595f4e8c98a/glance-httpd/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.481813 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_733aa67f-6cfa-4a18-bc24-44e770159808/glance-httpd/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.484137 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_733aa67f-6cfa-4a18-bc24-44e770159808/glance-log/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.668808 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-58c44c44b9-4hccn_390d1c49-9e71-4990-9fe9-d98fdd866322/heat-api/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.763284 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-57db5945bb-wkc7x_9cd166a7-f46c-4049-a692-3e0d3ba12606/heat-cfnapi/0.log" Mar 18 14:38:22 crc kubenswrapper[4921]: I0318 14:38:22.873535 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-79569bdc57-vlxb2_5876eeea-8bcc-4e40-8775-a9c183f68bed/heat-engine/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.022430 
4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77cf57cd59-b9zl7_935c08b2-0b06-41d3-809c-55ead7884c9c/horizon-log/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.043552 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-77cf57cd59-b9zl7_935c08b2-0b06-41d3-809c-55ead7884c9c/horizon/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.125718 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-82j2d_71031cc6-4940-49c7-acba-e58212bcf5f4/install-certs-openstack-openstack-cell1/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.424949 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29564041-8wr8v_a078d82d-65b2-4164-8bac-980a6d17780c/keystone-cron/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.461831 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-758655498b-jd8n6_074a690a-ebc7-418c-afb0-abbab8c638d7/keystone-api/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.590362 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-fhklf_978bea76-f0d3-43d3-9a2e-e024cfd086fa/install-os-openstack-openstack-cell1/0.log" Mar 18 14:38:23 crc kubenswrapper[4921]: I0318 14:38:23.644892 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_86806a0f-c6ce-42ec-acde-3919d0c60a65/kube-state-metrics/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.179393 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-9vfff_b747d245-da1c-4df2-b74d-d8f9f226b415/libvirt-openstack-openstack-cell1/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.398372 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-dnq6v_5e5ce1c9-5d40-4815-bab4-c0f2fc073e15/libvirt-openstack-openstack-cell1/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.548859 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_0485f76e-5cf0-460f-9dd7-ffa89b64455e/manila-api/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.589901 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_0485f76e-5cf0-460f-9dd7-ffa89b64455e/manila-api-log/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.611969 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-gm8q9_88ac5388-d9ec-408e-89d8-49c6c098a33b/libvirt-openstack-openstack-cell1/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.728157 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-rlkd2_3c9aa9e7-5524-4d45-8b76-55b2e0beb89e/libvirt-openstack-openstack-cell1/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.809456 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_94baf96e-4d22-40a6-a7d9-f04eb6f694b7/manila-scheduler/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.877334 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_94baf96e-4d22-40a6-a7d9-f04eb6f694b7/probe/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.879471 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_806616a6-2ba4-41b2-80ee-369a45cb1447/manila-share/0.log" Mar 18 14:38:24 crc kubenswrapper[4921]: I0318 14:38:24.983663 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_806616a6-2ba4-41b2-80ee-369a45cb1447/probe/0.log" Mar 18 14:38:25 crc kubenswrapper[4921]: I0318 14:38:25.074556 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_f1fbd8a8-29d9-4209-8f78-5831fdaf6c62/adoption/0.log" Mar 18 14:38:25 crc kubenswrapper[4921]: I0318 14:38:25.467672 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cc84b8dcc-z4hnz_e064cef7-7755-4e07-87e3-52b941cd9ead/neutron-api/0.log" Mar 18 14:38:25 crc kubenswrapper[4921]: I0318 14:38:25.480206 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cc84b8dcc-z4hnz_e064cef7-7755-4e07-87e3-52b941cd9ead/neutron-httpd/0.log" Mar 18 14:38:25 crc kubenswrapper[4921]: I0318 14:38:25.705243 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-nf6lh_07566af1-b678-4f13-ae4e-4a1c78a219fd/neutron-metadata-openstack-openstack-cell1/0.log" Mar 18 14:38:25 crc kubenswrapper[4921]: I0318 14:38:25.913035 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ec734ce5-0367-4e51-9875-3390097b2ebc/nova-api-log/0.log" Mar 18 14:38:25 crc kubenswrapper[4921]: I0318 14:38:25.984868 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ec734ce5-0367-4e51-9875-3390097b2ebc/nova-api-api/0.log" Mar 18 14:38:26 crc kubenswrapper[4921]: I0318 14:38:26.162931 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d8dc3d10-5594-4b62-b4ea-f895dbf5b0e4/nova-cell0-conductor-conductor/0.log" Mar 18 14:38:26 crc kubenswrapper[4921]: I0318 14:38:26.909857 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a0f778c5-fcbe-4592-a869-e6ded2907395/nova-cell1-conductor-conductor/0.log" Mar 18 14:38:26 crc kubenswrapper[4921]: I0318 14:38:26.944172 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ebcbff36-a111-4270-988d-1fbb923c2f47/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 14:38:27 crc kubenswrapper[4921]: 
I0318 14:38:27.473393 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e1e6617b-aaa3-4d0f-81cb-1149d2be62a6/nova-metadata-log/0.log" Mar 18 14:38:27 crc kubenswrapper[4921]: I0318 14:38:27.482374 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e1e6617b-aaa3-4d0f-81cb-1149d2be62a6/nova-metadata-metadata/0.log" Mar 18 14:38:27 crc kubenswrapper[4921]: I0318 14:38:27.500798 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6f4b2844-289a-49cc-aba0-7b97e9105181/nova-scheduler-scheduler/0.log" Mar 18 14:38:27 crc kubenswrapper[4921]: I0318 14:38:27.672482 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7495847675-nfrgn_d2b144d4-c8fe-40b4-ae86-fce2a83da57b/init/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.042647 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7495847675-nfrgn_d2b144d4-c8fe-40b4-ae86-fce2a83da57b/init/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.070175 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7495847675-nfrgn_d2b144d4-c8fe-40b4-ae86-fce2a83da57b/octavia-api-provider-agent/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.292591 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7495847675-nfrgn_d2b144d4-c8fe-40b4-ae86-fce2a83da57b/octavia-api/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.539337 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8xqfl_6a7589fc-80fa-4f87-988a-4d82d9a208c7/init/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.710013 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8xqfl_6a7589fc-80fa-4f87-988a-4d82d9a208c7/init/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.756522 
4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8xqfl_6a7589fc-80fa-4f87-988a-4d82d9a208c7/octavia-healthmanager/0.log" Mar 18 14:38:28 crc kubenswrapper[4921]: I0318 14:38:28.850332 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cwbms_fb567059-27c8-4f13-a81b-375261ab860a/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.032986 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cwbms_fb567059-27c8-4f13-a81b-375261ab860a/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.061179 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cwbms_fb567059-27c8-4f13-a81b-375261ab860a/octavia-housekeeping/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.184399 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-qxrzz_8352349b-7195-4927-9502-3a0f15b09da0/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.348001 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-qxrzz_8352349b-7195-4927-9502-3a0f15b09da0/octavia-amphora-httpd/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.382244 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-qxrzz_8352349b-7195-4927-9502-3a0f15b09da0/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.413304 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-4fw67_c7adf497-5641-422f-a7c3-cdeb5f8741bc/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.648329 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-h76fx_3cefc2a0-92a2-454e-91e1-7285f024f7ac/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.704704 4921 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-4fw67_c7adf497-5641-422f-a7c3-cdeb5f8741bc/octavia-rsyslog/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.765337 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-4fw67_c7adf497-5641-422f-a7c3-cdeb5f8741bc/init/0.log" Mar 18 14:38:29 crc kubenswrapper[4921]: I0318 14:38:29.889475 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-h76fx_3cefc2a0-92a2-454e-91e1-7285f024f7ac/init/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.016655 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78d10202-0f8e-4004-aede-0b5ed2c63589/mysql-bootstrap/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.219605 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78d10202-0f8e-4004-aede-0b5ed2c63589/mysql-bootstrap/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.253304 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78d10202-0f8e-4004-aede-0b5ed2c63589/galera/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.276632 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-h76fx_3cefc2a0-92a2-454e-91e1-7285f024f7ac/octavia-worker/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.414938 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_974cac56-e861-4c87-98aa-b5d0d098fa15/mysql-bootstrap/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.609362 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_974cac56-e861-4c87-98aa-b5d0d098fa15/mysql-bootstrap/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.631073 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_974cac56-e861-4c87-98aa-b5d0d098fa15/galera/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.682843 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ac1a56a3-581b-45b6-82f5-88216a7e74fd/openstackclient/0.log" Mar 18 14:38:30 crc kubenswrapper[4921]: I0318 14:38:30.808851 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8mwqw_e532c044-5e87-4338-a3d2-dd43379c2ba8/openstack-network-exporter/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.032629 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-757l2_f54f1467-48d2-424f-b694-485a89daea5d/ovsdb-server-init/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.228144 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-757l2_f54f1467-48d2-424f-b694-485a89daea5d/ovsdb-server-init/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.250155 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-757l2_f54f1467-48d2-424f-b694-485a89daea5d/ovsdb-server/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.324839 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-757l2_f54f1467-48d2-424f-b694-485a89daea5d/ovs-vswitchd/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.453946 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-p4s82_3ed65908-7baa-4263-adf8-14055c9fe856/ovn-controller/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.575533 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_fd7f3739-7f8a-47ee-8fd3-9f8e113ae7ae/adoption/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.718589 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_249b8f41-3ce7-4129-aa35-ac8b35f83aa0/openstack-network-exporter/0.log" Mar 18 14:38:31 crc kubenswrapper[4921]: I0318 14:38:31.836279 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_249b8f41-3ce7-4129-aa35-ac8b35f83aa0/ovn-northd/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.042425 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ce5116f2-ee21-4b85-a2fc-2dac3960be81/openstack-network-exporter/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.111505 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-pxqmq_fd44c11c-7a69-4da3-ab19-6e4a4a3603df/ovn-openstack-openstack-cell1/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.169230 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ce5116f2-ee21-4b85-a2fc-2dac3960be81/ovsdbserver-nb/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.282202 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d2299f72-ad34-47f5-9d33-14e55ac2b36e/openstack-network-exporter/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.352425 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d2299f72-ad34-47f5-9d33-14e55ac2b36e/ovsdbserver-nb/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.540393 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_cf98ae06-916f-4a34-845c-ab36705146c4/ovsdbserver-nb/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.545538 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_cf98ae06-916f-4a34-845c-ab36705146c4/openstack-network-exporter/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.688960 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_4a9d4713-c699-4829-85ca-0aea43794238/openstack-network-exporter/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.759799 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4a9d4713-c699-4829-85ca-0aea43794238/ovsdbserver-sb/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.865504 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_0ca0c85e-8f57-406b-a1c8-c439180a084d/openstack-network-exporter/0.log" Mar 18 14:38:32 crc kubenswrapper[4921]: I0318 14:38:32.924586 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_0ca0c85e-8f57-406b-a1c8-c439180a084d/ovsdbserver-sb/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.037982 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_7187adc5-d2b8-41cd-9ad8-08e8457d30e7/openstack-network-exporter/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.095927 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_7187adc5-d2b8-41cd-9ad8-08e8457d30e7/ovsdbserver-sb/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.309559 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c769655b6-7qjkh_da67364d-59f5-4049-b4a8-262b6ccf4eb5/placement-log/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.325072 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-c769655b6-7qjkh_da67364d-59f5-4049-b4a8-262b6ccf4eb5/placement-api/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.483341 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cjt672_717f9e13-67d4-4a9b-a104-b906a75094bf/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: 
I0318 14:38:33.563220 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_097f7983-d88f-4756-af26-aa3ca22d865a/memcached/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.579089 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cd7bae7-174c-4207-979c-7883deaa29fb/init-config-reloader/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.701777 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cd7bae7-174c-4207-979c-7883deaa29fb/init-config-reloader/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.723160 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cd7bae7-174c-4207-979c-7883deaa29fb/config-reloader/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.737843 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cd7bae7-174c-4207-979c-7883deaa29fb/thanos-sidecar/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.747061 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cd7bae7-174c-4207-979c-7883deaa29fb/prometheus/0.log" Mar 18 14:38:33 crc kubenswrapper[4921]: I0318 14:38:33.899673 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_750e5280-bd0c-45da-92c4-4f420995780d/setup-container/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.070827 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_750e5280-bd0c-45da-92c4-4f420995780d/setup-container/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.078806 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_750e5280-bd0c-45da-92c4-4f420995780d/rabbitmq/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 
14:38:34.137226 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1565cc76-b9f3-4dbd-b130-bcd4096db6da/setup-container/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.296624 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1565cc76-b9f3-4dbd-b130-bcd4096db6da/setup-container/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.391302 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-wwg7b_f8643aad-c6c3-42fb-85be-8227073e73c1/reboot-os-openstack-openstack-cell1/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.581015 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-lv6w4_0478480b-8d25-41f1-befb-14dfde857b39/run-os-openstack-openstack-cell1/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.646213 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-zhzh5_69262610-6712-4921-8a8f-43713d54e987/ssh-known-hosts-openstack/0.log" Mar 18 14:38:34 crc kubenswrapper[4921]: I0318 14:38:34.885738 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-95t9s_30d774e9-5ee7-42b1-8883-ddedecfcaa13/validate-network-openstack-openstack-cell1/0.log" Mar 18 14:38:36 crc kubenswrapper[4921]: I0318 14:38:36.988559 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-vzmjq_68781f83-55b8-448f-83e1-1981ded6fdd9/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 18 14:38:37 crc kubenswrapper[4921]: I0318 14:38:37.818216 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1565cc76-b9f3-4dbd-b130-bcd4096db6da/rabbitmq/0.log" Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.080830 4921 patch_prober.go:28] interesting 
pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.081568 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.081823 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.082701 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7f9507b3191d3ee30fbb7b57c8db74856ae4b7f60e965956118b9fe19bf769b"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.082799 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://d7f9507b3191d3ee30fbb7b57c8db74856ae4b7f60e965956118b9fe19bf769b" gracePeriod=600 Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.496263 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="d7f9507b3191d3ee30fbb7b57c8db74856ae4b7f60e965956118b9fe19bf769b" exitCode=0 Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.496337 
4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"d7f9507b3191d3ee30fbb7b57c8db74856ae4b7f60e965956118b9fe19bf769b"} Mar 18 14:38:47 crc kubenswrapper[4921]: I0318 14:38:47.497549 4921 scope.go:117] "RemoveContainer" containerID="eda0d5744b346de1fddb26185f9aa42a7a7188ecc21fb58829034a5b3e72a2be" Mar 18 14:38:48 crc kubenswrapper[4921]: I0318 14:38:48.516753 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerStarted","Data":"313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"} Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.278651 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/util/0.log" Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.502628 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/util/0.log" Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.567225 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/pull/0.log" Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.571752 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/pull/0.log" Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.763197 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/extract/0.log" Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.790853 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/util/0.log" Mar 18 14:39:03 crc kubenswrapper[4921]: I0318 14:39:03.793614 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540ewqqvh_c30738d0-ecb7-42ef-8428-f2f06233f338/pull/0.log" Mar 18 14:39:04 crc kubenswrapper[4921]: I0318 14:39:04.312822 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-rsjkx_e7fc79ba-0394-4b4d-94d3-7fb983330881/manager/0.log" Mar 18 14:39:04 crc kubenswrapper[4921]: I0318 14:39:04.549253 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-2f42g_bc0b0e69-c1d0-4e53-bc54-12ee0ccab318/manager/0.log" Mar 18 14:39:04 crc kubenswrapper[4921]: I0318 14:39:04.831799 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-c7hxm_04deb0db-06f5-428b-8c1f-b1c4585d3b79/manager/0.log" Mar 18 14:39:04 crc kubenswrapper[4921]: I0318 14:39:04.912304 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-nsnz7_aa7a1390-ce2f-4102-b998-d4dcf56abf25/manager/0.log" Mar 18 14:39:05 crc kubenswrapper[4921]: I0318 14:39:05.180255 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-cqxrg_55bf4c3f-da09-440f-9d0d-27942727e7eb/manager/0.log" Mar 18 14:39:05 crc kubenswrapper[4921]: I0318 14:39:05.574279 
4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-cb29f_865b7d13-bcd2-4ff5-ab6a-70dc8b85206b/manager/0.log" Mar 18 14:39:06 crc kubenswrapper[4921]: I0318 14:39:06.151751 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-wpc8r_0749ae95-942e-4331-bf11-707bb1cc131d/manager/0.log" Mar 18 14:39:06 crc kubenswrapper[4921]: I0318 14:39:06.295026 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-bx4rb_f923626e-2cdd-413f-8b7d-e983841061da/manager/0.log" Mar 18 14:39:06 crc kubenswrapper[4921]: I0318 14:39:06.452506 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-xwbwb_191b3452-75a7-49eb-953f-606943d143eb/manager/0.log" Mar 18 14:39:06 crc kubenswrapper[4921]: I0318 14:39:06.886057 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-png26_670fb623-168c-44fc-a437-daaaa77ea3cd/manager/0.log" Mar 18 14:39:06 crc kubenswrapper[4921]: I0318 14:39:06.893596 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-vj44m_e9b16b0b-6ae7-4f94-947d-d14ccce79710/manager/0.log" Mar 18 14:39:07 crc kubenswrapper[4921]: I0318 14:39:07.655708 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-c4m65_2c163c6b-f034-4ba4-bd7e-ab170b41cc23/manager/0.log" Mar 18 14:39:07 crc kubenswrapper[4921]: I0318 14:39:07.767085 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="78d10202-0f8e-4004-aede-0b5ed2c63589" containerName="galera" probeResult="failure" output="command timed out" Mar 18 14:39:08 crc 
kubenswrapper[4921]: I0318 14:39:08.044516 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-wb587_6f79bf2e-45a8-42d8-a3e5-a5322b80a0ba/manager/0.log" Mar 18 14:39:08 crc kubenswrapper[4921]: I0318 14:39:08.049208 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-zst7p_7c559a09-bcdc-4c4d-b326-1e91e920b262/manager/0.log" Mar 18 14:39:08 crc kubenswrapper[4921]: I0318 14:39:08.099368 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-2nnr8_a830f92b-2266-4b87-a165-a8db80990181/manager/0.log" Mar 18 14:39:08 crc kubenswrapper[4921]: I0318 14:39:08.474344 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68ccf9867-95s7w_bd46d074-f312-48a3-ae07-5889a432d9bd/operator/0.log" Mar 18 14:39:08 crc kubenswrapper[4921]: I0318 14:39:08.535180 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-8cvjq_9a5083f2-1437-4efc-b0ed-3e18a4bc8a81/registry-server/0.log" Mar 18 14:39:08 crc kubenswrapper[4921]: I0318 14:39:08.752590 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-srlqt_34010ed3-fc84-42ad-9011-160d4a107029/manager/0.log" Mar 18 14:39:08 crc kubenswrapper[4921]: I0318 14:39:08.866408 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-bhx7s_df25990a-3e3f-45f6-9b79-9fb9dc7ee3cb/manager/0.log" Mar 18 14:39:09 crc kubenswrapper[4921]: I0318 14:39:09.430188 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wzcxk_12fa5cc9-3f33-4574-831e-87596175e789/operator/0.log" Mar 18 
14:39:09 crc kubenswrapper[4921]: I0318 14:39:09.541265 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-n8b8w_958f7207-3507-4bbc-88ac-4f0e7f19f154/manager/0.log" Mar 18 14:39:10 crc kubenswrapper[4921]: I0318 14:39:10.041296 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-n67kw_ad4ad53b-44b8-46d4-8ef2-04e7859c3e60/manager/0.log" Mar 18 14:39:10 crc kubenswrapper[4921]: I0318 14:39:10.208262 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-ps7nq_49d9ed92-040c-45cd-ba21-a5b96f07fe95/manager/0.log" Mar 18 14:39:10 crc kubenswrapper[4921]: I0318 14:39:10.268432 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-h8b2h_f21bbed2-2ad1-468c-806f-eda2d4f2264e/manager/0.log" Mar 18 14:39:10 crc kubenswrapper[4921]: I0318 14:39:10.599728 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76c5949666-xzxsj_09b26899-4ee2-482d-b190-b57c5d4cdfd3/manager/0.log" Mar 18 14:39:32 crc kubenswrapper[4921]: I0318 14:39:32.769963 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qsrjd_71184a3d-1ecb-41e7-b7ed-9bc3e20131cb/control-plane-machine-set-operator/0.log" Mar 18 14:39:32 crc kubenswrapper[4921]: I0318 14:39:32.987788 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z5bqz_b86f3c6a-1db9-44c8-911c-46647c933bd7/kube-rbac-proxy/0.log" Mar 18 14:39:32 crc kubenswrapper[4921]: I0318 14:39:32.999076 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z5bqz_b86f3c6a-1db9-44c8-911c-46647c933bd7/machine-api-operator/0.log" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.543620 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9fnpq"] Mar 18 14:39:35 crc kubenswrapper[4921]: E0318 14:39:35.544760 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a400f81-0112-4f11-9718-5720e3e6e2ef" containerName="oc" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.544777 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a400f81-0112-4f11-9718-5720e3e6e2ef" containerName="oc" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.545024 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a400f81-0112-4f11-9718-5720e3e6e2ef" containerName="oc" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.546863 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.571725 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fnpq"] Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.630919 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-utilities\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.631142 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-catalog-content\") pod \"certified-operators-9fnpq\" (UID: 
\"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.631225 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqt4q\" (UniqueName: \"kubernetes.io/projected/7156483d-b623-42ed-82fd-4558d2d7e4c2-kube-api-access-mqt4q\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.732825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-utilities\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.732994 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-catalog-content\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.733072 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqt4q\" (UniqueName: \"kubernetes.io/projected/7156483d-b623-42ed-82fd-4558d2d7e4c2-kube-api-access-mqt4q\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.733471 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-utilities\") pod \"certified-operators-9fnpq\" (UID: 
\"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.733499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-catalog-content\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.754576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqt4q\" (UniqueName: \"kubernetes.io/projected/7156483d-b623-42ed-82fd-4558d2d7e4c2-kube-api-access-mqt4q\") pod \"certified-operators-9fnpq\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:35 crc kubenswrapper[4921]: I0318 14:39:35.889306 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:36 crc kubenswrapper[4921]: I0318 14:39:36.472852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fnpq"] Mar 18 14:39:37 crc kubenswrapper[4921]: I0318 14:39:37.084698 4921 generic.go:334] "Generic (PLEG): container finished" podID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerID="8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c" exitCode=0 Mar 18 14:39:37 crc kubenswrapper[4921]: I0318 14:39:37.084807 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerDied","Data":"8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c"} Mar 18 14:39:37 crc kubenswrapper[4921]: I0318 14:39:37.085059 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerStarted","Data":"33396fec37b3b581b42e94828f21488b7e16db4c0591ebebdff1e6bfbcbad7d2"} Mar 18 14:39:37 crc kubenswrapper[4921]: I0318 14:39:37.086406 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 14:39:38 crc kubenswrapper[4921]: I0318 14:39:38.100678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerStarted","Data":"7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533"} Mar 18 14:39:40 crc kubenswrapper[4921]: I0318 14:39:40.122715 4921 generic.go:334] "Generic (PLEG): container finished" podID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerID="7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533" exitCode=0 Mar 18 14:39:40 crc kubenswrapper[4921]: I0318 14:39:40.123323 4921 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerDied","Data":"7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533"} Mar 18 14:39:42 crc kubenswrapper[4921]: I0318 14:39:42.148358 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerStarted","Data":"1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781"} Mar 18 14:39:42 crc kubenswrapper[4921]: I0318 14:39:42.168754 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9fnpq" podStartSLOduration=3.633689289 podStartE2EDuration="7.168727919s" podCreationTimestamp="2026-03-18 14:39:35 +0000 UTC" firstStartedPulling="2026-03-18 14:39:37.08609413 +0000 UTC m=+8996.636014769" lastFinishedPulling="2026-03-18 14:39:40.62113276 +0000 UTC m=+9000.171053399" observedRunningTime="2026-03-18 14:39:42.167555465 +0000 UTC m=+9001.717476104" watchObservedRunningTime="2026-03-18 14:39:42.168727919 +0000 UTC m=+9001.718648578" Mar 18 14:39:45 crc kubenswrapper[4921]: I0318 14:39:45.890263 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:45 crc kubenswrapper[4921]: I0318 14:39:45.890829 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:45 crc kubenswrapper[4921]: I0318 14:39:45.948924 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:46 crc kubenswrapper[4921]: I0318 14:39:46.240828 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:46 crc kubenswrapper[4921]: I0318 14:39:46.300163 
4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fnpq"] Mar 18 14:39:48 crc kubenswrapper[4921]: I0318 14:39:48.407260 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9fnpq" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="registry-server" containerID="cri-o://1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781" gracePeriod=2 Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.269555 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.297286 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqt4q\" (UniqueName: \"kubernetes.io/projected/7156483d-b623-42ed-82fd-4558d2d7e4c2-kube-api-access-mqt4q\") pod \"7156483d-b623-42ed-82fd-4558d2d7e4c2\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.297487 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-utilities\") pod \"7156483d-b623-42ed-82fd-4558d2d7e4c2\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.297649 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-catalog-content\") pod \"7156483d-b623-42ed-82fd-4558d2d7e4c2\" (UID: \"7156483d-b623-42ed-82fd-4558d2d7e4c2\") " Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.300058 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-utilities" (OuterVolumeSpecName: "utilities") pod 
"7156483d-b623-42ed-82fd-4558d2d7e4c2" (UID: "7156483d-b623-42ed-82fd-4558d2d7e4c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.319329 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7156483d-b623-42ed-82fd-4558d2d7e4c2-kube-api-access-mqt4q" (OuterVolumeSpecName: "kube-api-access-mqt4q") pod "7156483d-b623-42ed-82fd-4558d2d7e4c2" (UID: "7156483d-b623-42ed-82fd-4558d2d7e4c2"). InnerVolumeSpecName "kube-api-access-mqt4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.360878 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7156483d-b623-42ed-82fd-4558d2d7e4c2" (UID: "7156483d-b623-42ed-82fd-4558d2d7e4c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.400314 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.400562 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqt4q\" (UniqueName: \"kubernetes.io/projected/7156483d-b623-42ed-82fd-4558d2d7e4c2-kube-api-access-mqt4q\") on node \"crc\" DevicePath \"\"" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.400646 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7156483d-b623-42ed-82fd-4558d2d7e4c2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.421909 4921 generic.go:334] "Generic (PLEG): container finished" podID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerID="1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781" exitCode=0 Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.421949 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerDied","Data":"1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781"} Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.421973 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fnpq" event={"ID":"7156483d-b623-42ed-82fd-4558d2d7e4c2","Type":"ContainerDied","Data":"33396fec37b3b581b42e94828f21488b7e16db4c0591ebebdff1e6bfbcbad7d2"} Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.421990 4921 scope.go:117] "RemoveContainer" containerID="1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 
14:39:49.422136 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9fnpq" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.443644 4921 scope.go:117] "RemoveContainer" containerID="7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.477328 4921 scope.go:117] "RemoveContainer" containerID="8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.480194 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fnpq"] Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.494823 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9fnpq"] Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.534168 4921 scope.go:117] "RemoveContainer" containerID="1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781" Mar 18 14:39:49 crc kubenswrapper[4921]: E0318 14:39:49.534650 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781\": container with ID starting with 1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781 not found: ID does not exist" containerID="1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.534715 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781"} err="failed to get container status \"1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781\": rpc error: code = NotFound desc = could not find container \"1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781\": container with ID starting with 
1508d5f70132b3fce1dee4659bc55e293ccca2b62114afcb7081bb4241f1e781 not found: ID does not exist" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.534798 4921 scope.go:117] "RemoveContainer" containerID="7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533" Mar 18 14:39:49 crc kubenswrapper[4921]: E0318 14:39:49.535265 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533\": container with ID starting with 7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533 not found: ID does not exist" containerID="7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.535317 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533"} err="failed to get container status \"7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533\": rpc error: code = NotFound desc = could not find container \"7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533\": container with ID starting with 7c61a3b12bdcafc757aac70f4ff9a3b9eb256b1488038a3492bdc3ee4bea8533 not found: ID does not exist" Mar 18 14:39:49 crc kubenswrapper[4921]: I0318 14:39:49.535336 4921 scope.go:117] "RemoveContainer" containerID="8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c" Mar 18 14:39:49 crc kubenswrapper[4921]: E0318 14:39:49.535734 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c\": container with ID starting with 8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c not found: ID does not exist" containerID="8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c" Mar 18 14:39:49 crc 
kubenswrapper[4921]: I0318 14:39:49.535782 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c"} err="failed to get container status \"8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c\": rpc error: code = NotFound desc = could not find container \"8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c\": container with ID starting with 8c6f9d5b560135f12c6cdaf00045df58435a6df258d1b42f4a4fa43b5bb83f0c not found: ID does not exist" Mar 18 14:39:50 crc kubenswrapper[4921]: I0318 14:39:50.879165 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-gb4dc_0d72c634-3ba0-4cc0-a0a6-c4eef4a403a9/cert-manager-controller/0.log" Mar 18 14:39:51 crc kubenswrapper[4921]: I0318 14:39:51.225641 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" path="/var/lib/kubelet/pods/7156483d-b623-42ed-82fd-4558d2d7e4c2/volumes" Mar 18 14:39:52 crc kubenswrapper[4921]: I0318 14:39:52.686920 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-qmjjj_aca422c3-e9e3-435e-8b17-8da14882eaae/cert-manager-webhook/0.log" Mar 18 14:39:52 crc kubenswrapper[4921]: I0318 14:39:52.718918 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-bzfzt_a64439b0-8ac8-43aa-addd-e6cdb72bf3f4/cert-manager-cainjector/0.log" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.197260 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564080-h2n9p"] Mar 18 14:40:00 crc kubenswrapper[4921]: E0318 14:40:00.198255 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="extract-content" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.198268 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="extract-content" Mar 18 14:40:00 crc kubenswrapper[4921]: E0318 14:40:00.198284 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="extract-utilities" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.198291 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="extract-utilities" Mar 18 14:40:00 crc kubenswrapper[4921]: E0318 14:40:00.198304 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.198310 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.198526 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7156483d-b623-42ed-82fd-4558d2d7e4c2" containerName="registry-server" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.199286 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.202217 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.202556 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.202729 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.207087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-h2n9p"] Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.333936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7b25\" (UniqueName: \"kubernetes.io/projected/863fca1b-c47d-4eff-bb21-65125bd4e2f8-kube-api-access-p7b25\") pod \"auto-csr-approver-29564080-h2n9p\" (UID: \"863fca1b-c47d-4eff-bb21-65125bd4e2f8\") " pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.435459 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7b25\" (UniqueName: \"kubernetes.io/projected/863fca1b-c47d-4eff-bb21-65125bd4e2f8-kube-api-access-p7b25\") pod \"auto-csr-approver-29564080-h2n9p\" (UID: \"863fca1b-c47d-4eff-bb21-65125bd4e2f8\") " pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.454482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7b25\" (UniqueName: \"kubernetes.io/projected/863fca1b-c47d-4eff-bb21-65125bd4e2f8-kube-api-access-p7b25\") pod \"auto-csr-approver-29564080-h2n9p\" (UID: \"863fca1b-c47d-4eff-bb21-65125bd4e2f8\") " 
pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:00 crc kubenswrapper[4921]: I0318 14:40:00.528710 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:01 crc kubenswrapper[4921]: I0318 14:40:01.008561 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-h2n9p"] Mar 18 14:40:01 crc kubenswrapper[4921]: I0318 14:40:01.806396 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" event={"ID":"863fca1b-c47d-4eff-bb21-65125bd4e2f8","Type":"ContainerStarted","Data":"f4b5ccc5df6061e9d0024c9e4e1f8fe8b84f465584dd00659cde1daa7bdac3fd"} Mar 18 14:40:02 crc kubenswrapper[4921]: I0318 14:40:02.817719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" event={"ID":"863fca1b-c47d-4eff-bb21-65125bd4e2f8","Type":"ContainerStarted","Data":"a212c7e585bfb339dd9a7174c150ef8729a378390b860ef8be2d98e341b50d32"} Mar 18 14:40:02 crc kubenswrapper[4921]: I0318 14:40:02.832997 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" podStartSLOduration=1.8219940239999999 podStartE2EDuration="2.832980174s" podCreationTimestamp="2026-03-18 14:40:00 +0000 UTC" firstStartedPulling="2026-03-18 14:40:01.024292262 +0000 UTC m=+9020.574212901" lastFinishedPulling="2026-03-18 14:40:02.035278402 +0000 UTC m=+9021.585199051" observedRunningTime="2026-03-18 14:40:02.832252593 +0000 UTC m=+9022.382173232" watchObservedRunningTime="2026-03-18 14:40:02.832980174 +0000 UTC m=+9022.382900833" Mar 18 14:40:03 crc kubenswrapper[4921]: I0318 14:40:03.829565 4921 generic.go:334] "Generic (PLEG): container finished" podID="863fca1b-c47d-4eff-bb21-65125bd4e2f8" containerID="a212c7e585bfb339dd9a7174c150ef8729a378390b860ef8be2d98e341b50d32" exitCode=0 Mar 18 14:40:03 crc 
kubenswrapper[4921]: I0318 14:40:03.829637 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" event={"ID":"863fca1b-c47d-4eff-bb21-65125bd4e2f8","Type":"ContainerDied","Data":"a212c7e585bfb339dd9a7174c150ef8729a378390b860ef8be2d98e341b50d32"} Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.319424 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.486548 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7b25\" (UniqueName: \"kubernetes.io/projected/863fca1b-c47d-4eff-bb21-65125bd4e2f8-kube-api-access-p7b25\") pod \"863fca1b-c47d-4eff-bb21-65125bd4e2f8\" (UID: \"863fca1b-c47d-4eff-bb21-65125bd4e2f8\") " Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.493300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863fca1b-c47d-4eff-bb21-65125bd4e2f8-kube-api-access-p7b25" (OuterVolumeSpecName: "kube-api-access-p7b25") pod "863fca1b-c47d-4eff-bb21-65125bd4e2f8" (UID: "863fca1b-c47d-4eff-bb21-65125bd4e2f8"). InnerVolumeSpecName "kube-api-access-p7b25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.589601 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7b25\" (UniqueName: \"kubernetes.io/projected/863fca1b-c47d-4eff-bb21-65125bd4e2f8-kube-api-access-p7b25\") on node \"crc\" DevicePath \"\"" Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.850913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" event={"ID":"863fca1b-c47d-4eff-bb21-65125bd4e2f8","Type":"ContainerDied","Data":"f4b5ccc5df6061e9d0024c9e4e1f8fe8b84f465584dd00659cde1daa7bdac3fd"} Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.851224 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b5ccc5df6061e9d0024c9e4e1f8fe8b84f465584dd00659cde1daa7bdac3fd" Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.851150 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564080-h2n9p" Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.973385 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-k8psk"] Mar 18 14:40:05 crc kubenswrapper[4921]: I0318 14:40:05.990531 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564074-k8psk"] Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.180699 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-l5tc4_e5a4eba4-f1c1-41ee-ac96-55385b0b77b4/nmstate-console-plugin/0.log" Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.220250 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2468cd73-3363-489d-b4b0-953943ba14e2" path="/var/lib/kubelet/pods/2468cd73-3363-489d-b4b0-953943ba14e2/volumes" Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.344674 4921 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qvc7c_a59a7883-d2d3-4f0f-bef7-afc18a6ab54e/nmstate-handler/0.log" Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.385922 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-pfrql_3d40f204-447a-4c6d-b289-8b9d21583b02/kube-rbac-proxy/0.log" Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.500556 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-pfrql_3d40f204-447a-4c6d-b289-8b9d21583b02/nmstate-metrics/0.log" Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.606835 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hw58p_c961752a-a620-4a3b-bbb3-10da43ae4a59/nmstate-operator/0.log" Mar 18 14:40:07 crc kubenswrapper[4921]: I0318 14:40:07.736729 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-w8sm5_11630f15-be33-4a14-9100-4d20eace4502/nmstate-webhook/0.log" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.286979 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pvjc"] Mar 18 14:40:17 crc kubenswrapper[4921]: E0318 14:40:17.288103 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863fca1b-c47d-4eff-bb21-65125bd4e2f8" containerName="oc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.288134 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="863fca1b-c47d-4eff-bb21-65125bd4e2f8" containerName="oc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.288397 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="863fca1b-c47d-4eff-bb21-65125bd4e2f8" containerName="oc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.290761 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.300493 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pvjc"] Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.442417 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwwpv\" (UniqueName: \"kubernetes.io/projected/9e6ee229-34a6-426e-9367-ebb90c358d4b-kube-api-access-lwwpv\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.442464 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-catalog-content\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.442490 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-utilities\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.544431 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwwpv\" (UniqueName: \"kubernetes.io/projected/9e6ee229-34a6-426e-9367-ebb90c358d4b-kube-api-access-lwwpv\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.544470 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-catalog-content\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.544495 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-utilities\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.545002 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-catalog-content\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.545039 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-utilities\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.601879 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwwpv\" (UniqueName: \"kubernetes.io/projected/9e6ee229-34a6-426e-9367-ebb90c358d4b-kube-api-access-lwwpv\") pod \"community-operators-4pvjc\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:17 crc kubenswrapper[4921]: I0318 14:40:17.653535 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:18 crc kubenswrapper[4921]: I0318 14:40:18.191260 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pvjc"] Mar 18 14:40:18 crc kubenswrapper[4921]: W0318 14:40:18.197494 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6ee229_34a6_426e_9367_ebb90c358d4b.slice/crio-249d78d6481ff758eb2cadee9162aad35f991dee57c4a96b2aae3462fc59c8e1 WatchSource:0}: Error finding container 249d78d6481ff758eb2cadee9162aad35f991dee57c4a96b2aae3462fc59c8e1: Status 404 returned error can't find the container with id 249d78d6481ff758eb2cadee9162aad35f991dee57c4a96b2aae3462fc59c8e1 Mar 18 14:40:18 crc kubenswrapper[4921]: I0318 14:40:18.400078 4921 scope.go:117] "RemoveContainer" containerID="2ae58cfea3b3a2759984450435b61b6a622b5aad35dcae10dafde96682ec845e" Mar 18 14:40:18 crc kubenswrapper[4921]: I0318 14:40:18.977061 4921 generic.go:334] "Generic (PLEG): container finished" podID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerID="beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608" exitCode=0 Mar 18 14:40:18 crc kubenswrapper[4921]: I0318 14:40:18.977233 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerDied","Data":"beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608"} Mar 18 14:40:18 crc kubenswrapper[4921]: I0318 14:40:18.977442 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerStarted","Data":"249d78d6481ff758eb2cadee9162aad35f991dee57c4a96b2aae3462fc59c8e1"} Mar 18 14:40:21 crc kubenswrapper[4921]: I0318 14:40:21.000588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerStarted","Data":"178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf"} Mar 18 14:40:22 crc kubenswrapper[4921]: I0318 14:40:22.011813 4921 generic.go:334] "Generic (PLEG): container finished" podID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerID="178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf" exitCode=0 Mar 18 14:40:22 crc kubenswrapper[4921]: I0318 14:40:22.011875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerDied","Data":"178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf"} Mar 18 14:40:23 crc kubenswrapper[4921]: I0318 14:40:23.024796 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerStarted","Data":"93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e"} Mar 18 14:40:23 crc kubenswrapper[4921]: I0318 14:40:23.045813 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4pvjc" podStartSLOduration=2.5860430660000002 podStartE2EDuration="6.045788115s" podCreationTimestamp="2026-03-18 14:40:17 +0000 UTC" firstStartedPulling="2026-03-18 14:40:18.980952931 +0000 UTC m=+9038.530873570" lastFinishedPulling="2026-03-18 14:40:22.44069798 +0000 UTC m=+9041.990618619" observedRunningTime="2026-03-18 14:40:23.042192591 +0000 UTC m=+9042.592113230" watchObservedRunningTime="2026-03-18 14:40:23.045788115 +0000 UTC m=+9042.595708754" Mar 18 14:40:25 crc kubenswrapper[4921]: I0318 14:40:25.512983 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5c775898b6-28rgz_bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8/prometheus-operator-admission-webhook/0.log" Mar 18 14:40:25 crc kubenswrapper[4921]: I0318 14:40:25.533053 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-qhmx7_c975b167-f2f3-4d74-9774-8f1734dac995/prometheus-operator/0.log" Mar 18 14:40:25 crc kubenswrapper[4921]: I0318 14:40:25.839959 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x_545fdd49-888f-466d-a528-ed4c2a57ce42/prometheus-operator-admission-webhook/0.log" Mar 18 14:40:25 crc kubenswrapper[4921]: I0318 14:40:25.915694 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-bbtjj_7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f/operator/0.log" Mar 18 14:40:26 crc kubenswrapper[4921]: I0318 14:40:26.143440 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-59cbb5b9bc-jz6kn_0b764acb-5fd6-4b33-b653-597e0b72d927/perses-operator/0.log" Mar 18 14:40:27 crc kubenswrapper[4921]: I0318 14:40:27.654165 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:27 crc kubenswrapper[4921]: I0318 14:40:27.654486 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:27 crc kubenswrapper[4921]: I0318 14:40:27.734983 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:28 crc kubenswrapper[4921]: I0318 14:40:28.113599 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:28 crc 
kubenswrapper[4921]: I0318 14:40:28.171688 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pvjc"] Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.086174 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4pvjc" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="registry-server" containerID="cri-o://93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e" gracePeriod=2 Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.608237 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.771839 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-catalog-content\") pod \"9e6ee229-34a6-426e-9367-ebb90c358d4b\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.771913 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwwpv\" (UniqueName: \"kubernetes.io/projected/9e6ee229-34a6-426e-9367-ebb90c358d4b-kube-api-access-lwwpv\") pod \"9e6ee229-34a6-426e-9367-ebb90c358d4b\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.772138 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-utilities\") pod \"9e6ee229-34a6-426e-9367-ebb90c358d4b\" (UID: \"9e6ee229-34a6-426e-9367-ebb90c358d4b\") " Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.773277 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-utilities" (OuterVolumeSpecName: "utilities") pod "9e6ee229-34a6-426e-9367-ebb90c358d4b" (UID: "9e6ee229-34a6-426e-9367-ebb90c358d4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.779548 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6ee229-34a6-426e-9367-ebb90c358d4b-kube-api-access-lwwpv" (OuterVolumeSpecName: "kube-api-access-lwwpv") pod "9e6ee229-34a6-426e-9367-ebb90c358d4b" (UID: "9e6ee229-34a6-426e-9367-ebb90c358d4b"). InnerVolumeSpecName "kube-api-access-lwwpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.855789 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e6ee229-34a6-426e-9367-ebb90c358d4b" (UID: "9e6ee229-34a6-426e-9367-ebb90c358d4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.874826 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwwpv\" (UniqueName: \"kubernetes.io/projected/9e6ee229-34a6-426e-9367-ebb90c358d4b-kube-api-access-lwwpv\") on node \"crc\" DevicePath \"\"" Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.874861 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:40:30 crc kubenswrapper[4921]: I0318 14:40:30.874871 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e6ee229-34a6-426e-9367-ebb90c358d4b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.100375 4921 generic.go:334] "Generic (PLEG): container finished" podID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerID="93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e" exitCode=0 Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.100424 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerDied","Data":"93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e"} Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.100460 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pvjc" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.100837 4921 scope.go:117] "RemoveContainer" containerID="93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.100714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pvjc" event={"ID":"9e6ee229-34a6-426e-9367-ebb90c358d4b","Type":"ContainerDied","Data":"249d78d6481ff758eb2cadee9162aad35f991dee57c4a96b2aae3462fc59c8e1"} Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.137312 4921 scope.go:117] "RemoveContainer" containerID="178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.138537 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pvjc"] Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.184672 4921 scope.go:117] "RemoveContainer" containerID="beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.187533 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4pvjc"] Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.223500 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" path="/var/lib/kubelet/pods/9e6ee229-34a6-426e-9367-ebb90c358d4b/volumes" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.228981 4921 scope.go:117] "RemoveContainer" containerID="93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e" Mar 18 14:40:31 crc kubenswrapper[4921]: E0318 14:40:31.229360 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e\": container with ID starting 
with 93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e not found: ID does not exist" containerID="93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.229391 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e"} err="failed to get container status \"93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e\": rpc error: code = NotFound desc = could not find container \"93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e\": container with ID starting with 93180eb07fed14893e8f22001830c4b45684fe9ea1f66e9dfef8ba85e912734e not found: ID does not exist" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.229412 4921 scope.go:117] "RemoveContainer" containerID="178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf" Mar 18 14:40:31 crc kubenswrapper[4921]: E0318 14:40:31.229693 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf\": container with ID starting with 178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf not found: ID does not exist" containerID="178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.229714 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf"} err="failed to get container status \"178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf\": rpc error: code = NotFound desc = could not find container \"178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf\": container with ID starting with 178d9c859d904671315ae67cea41fd7b29685ba8cc1e519beaee29e29ebbddaf not found: ID does 
not exist" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.229727 4921 scope.go:117] "RemoveContainer" containerID="beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608" Mar 18 14:40:31 crc kubenswrapper[4921]: E0318 14:40:31.229968 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608\": container with ID starting with beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608 not found: ID does not exist" containerID="beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608" Mar 18 14:40:31 crc kubenswrapper[4921]: I0318 14:40:31.229985 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608"} err="failed to get container status \"beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608\": rpc error: code = NotFound desc = could not find container \"beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608\": container with ID starting with beda4b65145e650bf583849edf7f2b4a1db7853e6a9b80f40ab3121164b6f608 not found: ID does not exist" Mar 18 14:40:40 crc kubenswrapper[4921]: I0318 14:40:40.338836 4921 patch_prober.go:28] interesting pod/route-controller-manager-7c8fdb45bd-zhx4b container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: i/o timeout" start-of-body= Mar 18 14:40:40 crc kubenswrapper[4921]: I0318 14:40:40.339467 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" podUID="82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: 
i/o timeout" Mar 18 14:40:40 crc kubenswrapper[4921]: I0318 14:40:40.357588 4921 patch_prober.go:28] interesting pod/route-controller-manager-7c8fdb45bd-zhx4b container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": context deadline exceeded" start-of-body= Mar 18 14:40:40 crc kubenswrapper[4921]: I0318 14:40:40.357666 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c8fdb45bd-zhx4b" podUID="82f4dc40-edd9-4add-9e4a-ac9cf4f9e7c5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": context deadline exceeded" Mar 18 14:40:44 crc kubenswrapper[4921]: I0318 14:40:44.357202 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vqkhh_9f3127ca-083e-4013-8b41-1981194a7624/kube-rbac-proxy/0.log" Mar 18 14:40:44 crc kubenswrapper[4921]: I0318 14:40:44.778918 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-vqkhh_9f3127ca-083e-4013-8b41-1981194a7624/controller/0.log" Mar 18 14:40:44 crc kubenswrapper[4921]: I0318 14:40:44.889778 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-frr-files/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.103844 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-reloader/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.105070 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-frr-files/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.154157 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-reloader/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.189302 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-metrics/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.465140 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-metrics/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.493299 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-reloader/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.515867 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-frr-files/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.533499 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-metrics/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.700089 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-frr-files/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.780006 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-reloader/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.808236 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/cp-metrics/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.862098 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/controller/0.log" Mar 18 14:40:45 crc kubenswrapper[4921]: I0318 14:40:45.977779 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/frr-metrics/0.log" Mar 18 14:40:46 crc kubenswrapper[4921]: I0318 14:40:46.152318 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/kube-rbac-proxy/0.log" Mar 18 14:40:46 crc kubenswrapper[4921]: I0318 14:40:46.207725 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/kube-rbac-proxy-frr/0.log" Mar 18 14:40:46 crc kubenswrapper[4921]: I0318 14:40:46.253845 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/reloader/0.log" Mar 18 14:40:46 crc kubenswrapper[4921]: I0318 14:40:46.993721 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-nbvzr_215630e2-ecef-4272-889c-f4ef039f4eab/frr-k8s-webhook-server/0.log" Mar 18 14:40:47 crc kubenswrapper[4921]: I0318 14:40:47.186651 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b768b9d77-lrsfq_67daa5a5-ba64-4de4-95c8-d3217b539a80/manager/0.log" Mar 18 14:40:47 crc kubenswrapper[4921]: I0318 14:40:47.375882 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-584c55bff8-7t5c4_2af7d7fb-43a0-4481-ad60-17d1448df801/webhook-server/0.log" Mar 18 14:40:47 crc kubenswrapper[4921]: I0318 14:40:47.533034 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7rj4p_d09ed9aa-00f0-4f1b-b259-c2dca6943b9d/kube-rbac-proxy/0.log" Mar 18 14:40:48 crc kubenswrapper[4921]: I0318 14:40:48.484427 4921 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7rj4p_d09ed9aa-00f0-4f1b-b259-c2dca6943b9d/speaker/0.log" Mar 18 14:40:49 crc kubenswrapper[4921]: I0318 14:40:49.524454 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b2vvr_53ad344e-7f2e-4dae-ab48-0c2eb650c6e6/frr/0.log" Mar 18 14:41:03 crc kubenswrapper[4921]: I0318 14:41:03.643551 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/util/0.log" Mar 18 14:41:03 crc kubenswrapper[4921]: I0318 14:41:03.863665 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/util/0.log" Mar 18 14:41:03 crc kubenswrapper[4921]: I0318 14:41:03.882184 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/pull/0.log" Mar 18 14:41:03 crc kubenswrapper[4921]: I0318 14:41:03.883190 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/pull/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.093861 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/pull/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.156017 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/extract/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.162413 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874q76kz_af17cbc8-f48a-499e-af36-8f4420f43e4d/util/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.316232 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/util/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.491991 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/pull/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.530302 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/util/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.556272 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/pull/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.687760 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/pull/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.696891 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/util/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.714647 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18czns_0a4d9566-78f0-4989-be26-efb5416bcbac/extract/0.log" Mar 18 14:41:04 crc kubenswrapper[4921]: I0318 14:41:04.917023 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/util/0.log" Mar 18 14:41:05 crc kubenswrapper[4921]: I0318 14:41:05.027153 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/util/0.log" Mar 18 14:41:05 crc kubenswrapper[4921]: I0318 14:41:05.094523 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/pull/0.log" Mar 18 14:41:05 crc kubenswrapper[4921]: I0318 14:41:05.116925 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/pull/0.log" Mar 18 14:41:05 crc kubenswrapper[4921]: I0318 14:41:05.284035 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/util/0.log" Mar 18 14:41:05 crc kubenswrapper[4921]: I0318 14:41:05.319431 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/pull/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.131840 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vs58c_139d771c-f615-472e-9ac9-ce64e4f543bf/extract/0.log" Mar 
18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.132255 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/util/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.295160 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/pull/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.403404 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/pull/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.430883 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/util/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.609398 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/util/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.641784 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/pull/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.674670 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726xhljr_f14ebc50-775b-49df-85a7-f3e9c736b149/extract/0.log" Mar 18 14:41:06 crc kubenswrapper[4921]: I0318 14:41:06.836978 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/extract-utilities/0.log" Mar 18 14:41:07 crc kubenswrapper[4921]: I0318 14:41:07.042795 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/extract-content/0.log" Mar 18 14:41:07 crc kubenswrapper[4921]: I0318 14:41:07.078715 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/extract-utilities/0.log" Mar 18 14:41:07 crc kubenswrapper[4921]: I0318 14:41:07.081366 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/extract-content/0.log" Mar 18 14:41:07 crc kubenswrapper[4921]: I0318 14:41:07.262136 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/extract-utilities/0.log" Mar 18 14:41:07 crc kubenswrapper[4921]: I0318 14:41:07.312673 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/extract-content/0.log" Mar 18 14:41:07 crc kubenswrapper[4921]: I0318 14:41:07.492561 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/extract-utilities/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.249640 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/extract-content/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.276538 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/extract-utilities/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.339665 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/extract-content/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.504935 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dk9qh_1dfe7c23-1893-4da0-a65b-dfb1c0da89e8/registry-server/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.706956 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/extract-content/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.743648 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/extract-utilities/0.log" Mar 18 14:41:08 crc kubenswrapper[4921]: I0318 14:41:08.910346 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mbf4l_7908e95d-6c90-41f8-924d-072fd69e70c6/marketplace-operator/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.011124 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/extract-utilities/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.249602 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/extract-content/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.294760 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/extract-utilities/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.295507 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/extract-content/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.558913 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/extract-utilities/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.625398 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/extract-content/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.756181 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/extract-utilities/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.914956 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p4d9q_0edcf939-f4bf-4aba-84fe-8e44a12fac21/registry-server/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.975330 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/extract-content/0.log" Mar 18 14:41:09 crc kubenswrapper[4921]: I0318 14:41:09.997386 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/extract-utilities/0.log" Mar 18 14:41:10 crc kubenswrapper[4921]: I0318 14:41:10.032503 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/extract-content/0.log" 
Mar 18 14:41:10 crc kubenswrapper[4921]: I0318 14:41:10.149835 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rm29k_f56b6108-5b3e-40ac-b8f6-f977464feb3b/registry-server/0.log" Mar 18 14:41:10 crc kubenswrapper[4921]: I0318 14:41:10.269002 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/extract-utilities/0.log" Mar 18 14:41:10 crc kubenswrapper[4921]: I0318 14:41:10.302597 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/extract-content/0.log" Mar 18 14:41:11 crc kubenswrapper[4921]: I0318 14:41:11.268969 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k4g8n_c452e3b2-05ab-4e09-a6a9-e59aee2e30cd/registry-server/0.log" Mar 18 14:41:17 crc kubenswrapper[4921]: I0318 14:41:17.080870 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:41:17 crc kubenswrapper[4921]: I0318 14:41:17.081505 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:41:18 crc kubenswrapper[4921]: I0318 14:41:18.511480 4921 scope.go:117] "RemoveContainer" containerID="310848713c98dad2c605ceb68f75e6a3cd47434bd384d49e2405858cc9720b45" Mar 18 14:41:26 crc kubenswrapper[4921]: I0318 14:41:26.497120 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5c775898b6-28rgz_bffc748b-76e8-4aa6-9a6f-684d5cc7cfd8/prometheus-operator-admission-webhook/0.log" Mar 18 14:41:26 crc kubenswrapper[4921]: I0318 14:41:26.499587 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-qhmx7_c975b167-f2f3-4d74-9774-8f1734dac995/prometheus-operator/0.log" Mar 18 14:41:26 crc kubenswrapper[4921]: I0318 14:41:26.533386 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5c775898b6-fxp5x_545fdd49-888f-466d-a528-ed4c2a57ce42/prometheus-operator-admission-webhook/0.log" Mar 18 14:41:26 crc kubenswrapper[4921]: I0318 14:41:26.683475 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-59cbb5b9bc-jz6kn_0b764acb-5fd6-4b33-b653-597e0b72d927/perses-operator/0.log" Mar 18 14:41:26 crc kubenswrapper[4921]: I0318 14:41:26.697405 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-bbtjj_7de24d45-dec2-4515-b5fe-f3ad4bc7fd8f/operator/0.log" Mar 18 14:41:33 crc kubenswrapper[4921]: E0318 14:41:33.910627 4921 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.200:52414->38.129.56.200:45633: write tcp 38.129.56.200:52414->38.129.56.200:45633: write: broken pipe Mar 18 14:41:37 crc kubenswrapper[4921]: E0318 14:41:37.667223 4921 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.200:52652->38.129.56.200:45633: read tcp 38.129.56.200:52652->38.129.56.200:45633: read: connection reset by peer Mar 18 14:41:47 crc kubenswrapper[4921]: I0318 14:41:47.081949 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:41:47 crc kubenswrapper[4921]: I0318 14:41:47.083940 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.165486 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564082-z65nn"] Mar 18 14:42:00 crc kubenswrapper[4921]: E0318 14:42:00.166883 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="extract-utilities" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.166906 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="extract-utilities" Mar 18 14:42:00 crc kubenswrapper[4921]: E0318 14:42:00.166997 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="registry-server" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.167014 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="registry-server" Mar 18 14:42:00 crc kubenswrapper[4921]: E0318 14:42:00.167041 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="extract-content" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.167052 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="extract-content" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.167393 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e6ee229-34a6-426e-9367-ebb90c358d4b" containerName="registry-server" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.169217 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.174015 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.174451 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.174623 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.178916 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-z65nn"] Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.239929 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89r2v\" (UniqueName: \"kubernetes.io/projected/2d1a262f-62fa-47f6-ac60-3b015eaf1f5f-kube-api-access-89r2v\") pod \"auto-csr-approver-29564082-z65nn\" (UID: \"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f\") " pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.342212 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89r2v\" (UniqueName: \"kubernetes.io/projected/2d1a262f-62fa-47f6-ac60-3b015eaf1f5f-kube-api-access-89r2v\") pod \"auto-csr-approver-29564082-z65nn\" (UID: \"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f\") " pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.369317 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89r2v\" 
(UniqueName: \"kubernetes.io/projected/2d1a262f-62fa-47f6-ac60-3b015eaf1f5f-kube-api-access-89r2v\") pod \"auto-csr-approver-29564082-z65nn\" (UID: \"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f\") " pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.499886 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:00 crc kubenswrapper[4921]: W0318 14:42:00.984339 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d1a262f_62fa_47f6_ac60_3b015eaf1f5f.slice/crio-13556008eb688407369e14814441e036670e3854e2e5c913743abb1c5ba2b00a WatchSource:0}: Error finding container 13556008eb688407369e14814441e036670e3854e2e5c913743abb1c5ba2b00a: Status 404 returned error can't find the container with id 13556008eb688407369e14814441e036670e3854e2e5c913743abb1c5ba2b00a Mar 18 14:42:00 crc kubenswrapper[4921]: I0318 14:42:00.993592 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564082-z65nn"] Mar 18 14:42:01 crc kubenswrapper[4921]: I0318 14:42:01.927415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-z65nn" event={"ID":"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f","Type":"ContainerStarted","Data":"13556008eb688407369e14814441e036670e3854e2e5c913743abb1c5ba2b00a"} Mar 18 14:42:03 crc kubenswrapper[4921]: I0318 14:42:03.954588 4921 generic.go:334] "Generic (PLEG): container finished" podID="2d1a262f-62fa-47f6-ac60-3b015eaf1f5f" containerID="86df9e8384e37c705dd5d680c1595578346bec24594ec175fe3c4678cbc2b006" exitCode=0 Mar 18 14:42:03 crc kubenswrapper[4921]: I0318 14:42:03.954641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-z65nn" 
event={"ID":"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f","Type":"ContainerDied","Data":"86df9e8384e37c705dd5d680c1595578346bec24594ec175fe3c4678cbc2b006"} Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.623950 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.670461 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89r2v\" (UniqueName: \"kubernetes.io/projected/2d1a262f-62fa-47f6-ac60-3b015eaf1f5f-kube-api-access-89r2v\") pod \"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f\" (UID: \"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f\") " Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.679973 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1a262f-62fa-47f6-ac60-3b015eaf1f5f-kube-api-access-89r2v" (OuterVolumeSpecName: "kube-api-access-89r2v") pod "2d1a262f-62fa-47f6-ac60-3b015eaf1f5f" (UID: "2d1a262f-62fa-47f6-ac60-3b015eaf1f5f"). InnerVolumeSpecName "kube-api-access-89r2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.774046 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89r2v\" (UniqueName: \"kubernetes.io/projected/2d1a262f-62fa-47f6-ac60-3b015eaf1f5f-kube-api-access-89r2v\") on node \"crc\" DevicePath \"\"" Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.981982 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564082-z65nn" event={"ID":"2d1a262f-62fa-47f6-ac60-3b015eaf1f5f","Type":"ContainerDied","Data":"13556008eb688407369e14814441e036670e3854e2e5c913743abb1c5ba2b00a"} Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.982050 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564082-z65nn" Mar 18 14:42:05 crc kubenswrapper[4921]: I0318 14:42:05.982057 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13556008eb688407369e14814441e036670e3854e2e5c913743abb1c5ba2b00a" Mar 18 14:42:06 crc kubenswrapper[4921]: I0318 14:42:06.709135 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-2tz9f"] Mar 18 14:42:06 crc kubenswrapper[4921]: I0318 14:42:06.720554 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564076-2tz9f"] Mar 18 14:42:07 crc kubenswrapper[4921]: I0318 14:42:07.226550 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d796a30-949c-46fe-8cac-21ec75723bc7" path="/var/lib/kubelet/pods/7d796a30-949c-46fe-8cac-21ec75723bc7/volumes" Mar 18 14:42:17 crc kubenswrapper[4921]: I0318 14:42:17.080941 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fsfj7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 14:42:17 crc kubenswrapper[4921]: I0318 14:42:17.081589 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 14:42:17 crc kubenswrapper[4921]: I0318 14:42:17.081644 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" Mar 18 14:42:17 crc kubenswrapper[4921]: I0318 14:42:17.082744 4921 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"} pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 14:42:17 crc kubenswrapper[4921]: I0318 14:42:17.083326 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" containerName="machine-config-daemon" containerID="cri-o://313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" gracePeriod=600 Mar 18 14:42:18 crc kubenswrapper[4921]: I0318 14:42:18.132843 4921 generic.go:334] "Generic (PLEG): container finished" podID="509553d8-b894-456c-a45e-665e8497cdbc" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" exitCode=0 Mar 18 14:42:18 crc kubenswrapper[4921]: I0318 14:42:18.133104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" event={"ID":"509553d8-b894-456c-a45e-665e8497cdbc","Type":"ContainerDied","Data":"313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"} Mar 18 14:42:18 crc kubenswrapper[4921]: I0318 14:42:18.133364 4921 scope.go:117] "RemoveContainer" containerID="d7f9507b3191d3ee30fbb7b57c8db74856ae4b7f60e965956118b9fe19bf769b" Mar 18 14:42:18 crc kubenswrapper[4921]: E0318 14:42:18.297605 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:42:18 crc 
kubenswrapper[4921]: I0318 14:42:18.592558 4921 scope.go:117] "RemoveContainer" containerID="87ed8adc92edd5c8806fff7a76b905f86aa1db5d4e323a21c4d745796bb7865d" Mar 18 14:42:19 crc kubenswrapper[4921]: I0318 14:42:19.152828 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:42:19 crc kubenswrapper[4921]: E0318 14:42:19.153581 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:42:31 crc kubenswrapper[4921]: I0318 14:42:31.233021 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:42:31 crc kubenswrapper[4921]: E0318 14:42:31.234478 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:42:42 crc kubenswrapper[4921]: I0318 14:42:42.209715 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:42:42 crc kubenswrapper[4921]: E0318 14:42:42.210763 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:42:57 crc kubenswrapper[4921]: I0318 14:42:57.209731 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:42:57 crc kubenswrapper[4921]: E0318 14:42:57.210653 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:43:10 crc kubenswrapper[4921]: I0318 14:43:10.208940 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:43:10 crc kubenswrapper[4921]: E0318 14:43:10.209833 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.761571 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bbbgn"] Mar 18 14:43:12 crc kubenswrapper[4921]: E0318 14:43:12.762758 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1a262f-62fa-47f6-ac60-3b015eaf1f5f" containerName="oc" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.762777 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1a262f-62fa-47f6-ac60-3b015eaf1f5f" containerName="oc" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.763094 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1a262f-62fa-47f6-ac60-3b015eaf1f5f" containerName="oc" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.764969 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.792674 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbbgn"] Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.818436 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqc5w\" (UniqueName: \"kubernetes.io/projected/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-kube-api-access-hqc5w\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.818502 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-utilities\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.818528 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-catalog-content\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.924445 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqc5w\" (UniqueName: \"kubernetes.io/projected/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-kube-api-access-hqc5w\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.924553 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-utilities\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.924590 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-catalog-content\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.925496 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-catalog-content\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.925618 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-utilities\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:12 crc kubenswrapper[4921]: I0318 14:43:12.994995 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hqc5w\" (UniqueName: \"kubernetes.io/projected/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-kube-api-access-hqc5w\") pod \"redhat-marketplace-bbbgn\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:13 crc kubenswrapper[4921]: I0318 14:43:13.092609 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:13 crc kubenswrapper[4921]: I0318 14:43:13.587964 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbbgn"] Mar 18 14:43:13 crc kubenswrapper[4921]: I0318 14:43:13.815663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerStarted","Data":"3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f"} Mar 18 14:43:13 crc kubenswrapper[4921]: I0318 14:43:13.815878 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerStarted","Data":"df460ec0f5a28b056e64be06650f653136ed1f939ccbdd2f363fe67724b05f7d"} Mar 18 14:43:14 crc kubenswrapper[4921]: I0318 14:43:14.831713 4921 generic.go:334] "Generic (PLEG): container finished" podID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerID="3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f" exitCode=0 Mar 18 14:43:14 crc kubenswrapper[4921]: I0318 14:43:14.831799 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerDied","Data":"3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f"} Mar 18 14:43:16 crc kubenswrapper[4921]: I0318 14:43:16.862100 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerStarted","Data":"bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb"} Mar 18 14:43:17 crc kubenswrapper[4921]: I0318 14:43:17.875802 4921 generic.go:334] "Generic (PLEG): container finished" podID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerID="bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb" exitCode=0 Mar 18 14:43:17 crc kubenswrapper[4921]: I0318 14:43:17.875864 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerDied","Data":"bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb"} Mar 18 14:43:18 crc kubenswrapper[4921]: I0318 14:43:18.890911 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerStarted","Data":"eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b"} Mar 18 14:43:18 crc kubenswrapper[4921]: I0318 14:43:18.916807 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bbbgn" podStartSLOduration=3.2613991159999998 podStartE2EDuration="6.916784251s" podCreationTimestamp="2026-03-18 14:43:12 +0000 UTC" firstStartedPulling="2026-03-18 14:43:14.834819467 +0000 UTC m=+9214.384740106" lastFinishedPulling="2026-03-18 14:43:18.490204592 +0000 UTC m=+9218.040125241" observedRunningTime="2026-03-18 14:43:18.90761605 +0000 UTC m=+9218.457536689" watchObservedRunningTime="2026-03-18 14:43:18.916784251 +0000 UTC m=+9218.466704890" Mar 18 14:43:23 crc kubenswrapper[4921]: I0318 14:43:23.093431 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:23 crc kubenswrapper[4921]: I0318 14:43:23.094019 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:23 crc kubenswrapper[4921]: I0318 14:43:23.141366 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:24 crc kubenswrapper[4921]: I0318 14:43:24.010920 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:24 crc kubenswrapper[4921]: I0318 14:43:24.066314 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbbgn"] Mar 18 14:43:25 crc kubenswrapper[4921]: I0318 14:43:25.209682 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:43:25 crc kubenswrapper[4921]: E0318 14:43:25.209959 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:43:25 crc kubenswrapper[4921]: I0318 14:43:25.974493 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bbbgn" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="registry-server" containerID="cri-o://eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b" gracePeriod=2 Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.624712 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.757920 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqc5w\" (UniqueName: \"kubernetes.io/projected/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-kube-api-access-hqc5w\") pod \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.758464 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-utilities\") pod \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.758701 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-catalog-content\") pod \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\" (UID: \"568e33a6-3bb8-4906-8a8c-43d71e0e51b6\") " Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.759247 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-utilities" (OuterVolumeSpecName: "utilities") pod "568e33a6-3bb8-4906-8a8c-43d71e0e51b6" (UID: "568e33a6-3bb8-4906-8a8c-43d71e0e51b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.759775 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.763028 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-kube-api-access-hqc5w" (OuterVolumeSpecName: "kube-api-access-hqc5w") pod "568e33a6-3bb8-4906-8a8c-43d71e0e51b6" (UID: "568e33a6-3bb8-4906-8a8c-43d71e0e51b6"). InnerVolumeSpecName "kube-api-access-hqc5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.787127 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "568e33a6-3bb8-4906-8a8c-43d71e0e51b6" (UID: "568e33a6-3bb8-4906-8a8c-43d71e0e51b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.861545 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqc5w\" (UniqueName: \"kubernetes.io/projected/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-kube-api-access-hqc5w\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.861581 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568e33a6-3bb8-4906-8a8c-43d71e0e51b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.984075 4921 generic.go:334] "Generic (PLEG): container finished" podID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerID="eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b" exitCode=0 Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.984154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerDied","Data":"eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b"} Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.984185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbbgn" event={"ID":"568e33a6-3bb8-4906-8a8c-43d71e0e51b6","Type":"ContainerDied","Data":"df460ec0f5a28b056e64be06650f653136ed1f939ccbdd2f363fe67724b05f7d"} Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.984208 4921 scope.go:117] "RemoveContainer" containerID="eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b" Mar 18 14:43:26 crc kubenswrapper[4921]: I0318 14:43:26.984356 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbbgn" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.021262 4921 scope.go:117] "RemoveContainer" containerID="bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.022505 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbbgn"] Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.036630 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbbgn"] Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.050336 4921 scope.go:117] "RemoveContainer" containerID="3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.095335 4921 scope.go:117] "RemoveContainer" containerID="eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b" Mar 18 14:43:27 crc kubenswrapper[4921]: E0318 14:43:27.095812 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b\": container with ID starting with eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b not found: ID does not exist" containerID="eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.095841 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b"} err="failed to get container status \"eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b\": rpc error: code = NotFound desc = could not find container \"eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b\": container with ID starting with eb9aa36699257aa7fd892b67efe6e2946dc56423ff3ac26fd7257a8a6692fe3b not found: 
ID does not exist" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.095861 4921 scope.go:117] "RemoveContainer" containerID="bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb" Mar 18 14:43:27 crc kubenswrapper[4921]: E0318 14:43:27.096279 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb\": container with ID starting with bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb not found: ID does not exist" containerID="bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.096369 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb"} err="failed to get container status \"bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb\": rpc error: code = NotFound desc = could not find container \"bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb\": container with ID starting with bf1a64aecb9b3f4f56c5323c5476c161284c05634675296f8f7a354a93de49cb not found: ID does not exist" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.096461 4921 scope.go:117] "RemoveContainer" containerID="3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f" Mar 18 14:43:27 crc kubenswrapper[4921]: E0318 14:43:27.096895 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f\": container with ID starting with 3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f not found: ID does not exist" containerID="3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.096918 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f"} err="failed to get container status \"3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f\": rpc error: code = NotFound desc = could not find container \"3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f\": container with ID starting with 3b6f72cd25245bdf98c3197305275a4071086542644336394f16c2ab64c8ad3f not found: ID does not exist" Mar 18 14:43:27 crc kubenswrapper[4921]: I0318 14:43:27.224740 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" path="/var/lib/kubelet/pods/568e33a6-3bb8-4906-8a8c-43d71e0e51b6/volumes" Mar 18 14:43:36 crc kubenswrapper[4921]: I0318 14:43:36.209723 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:43:36 crc kubenswrapper[4921]: E0318 14:43:36.210599 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:43:45 crc kubenswrapper[4921]: I0318 14:43:45.267458 4921 generic.go:334] "Generic (PLEG): container finished" podID="b130adda-ecdd-4197-91c8-39c7817474a8" containerID="43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176" exitCode=0 Mar 18 14:43:45 crc kubenswrapper[4921]: I0318 14:43:45.267559 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7752b/must-gather-w5rnd" event={"ID":"b130adda-ecdd-4197-91c8-39c7817474a8","Type":"ContainerDied","Data":"43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176"} Mar 18 
14:43:45 crc kubenswrapper[4921]: I0318 14:43:45.269353 4921 scope.go:117] "RemoveContainer" containerID="43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176" Mar 18 14:43:45 crc kubenswrapper[4921]: I0318 14:43:45.766925 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7752b_must-gather-w5rnd_b130adda-ecdd-4197-91c8-39c7817474a8/gather/0.log" Mar 18 14:43:51 crc kubenswrapper[4921]: I0318 14:43:51.218275 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4" Mar 18 14:43:51 crc kubenswrapper[4921]: E0318 14:43:51.226870 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc" Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.300886 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7752b/must-gather-w5rnd"] Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.301728 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7752b/must-gather-w5rnd" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="copy" containerID="cri-o://aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29" gracePeriod=2 Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.309796 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7752b/must-gather-w5rnd"] Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.779051 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7752b_must-gather-w5rnd_b130adda-ecdd-4197-91c8-39c7817474a8/copy/0.log" Mar 18 
14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.779938 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7752b/must-gather-w5rnd"
Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.932236 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b130adda-ecdd-4197-91c8-39c7817474a8-must-gather-output\") pod \"b130adda-ecdd-4197-91c8-39c7817474a8\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") "
Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.932455 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49mh\" (UniqueName: \"kubernetes.io/projected/b130adda-ecdd-4197-91c8-39c7817474a8-kube-api-access-g49mh\") pod \"b130adda-ecdd-4197-91c8-39c7817474a8\" (UID: \"b130adda-ecdd-4197-91c8-39c7817474a8\") "
Mar 18 14:43:54 crc kubenswrapper[4921]: I0318 14:43:54.973788 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b130adda-ecdd-4197-91c8-39c7817474a8-kube-api-access-g49mh" (OuterVolumeSpecName: "kube-api-access-g49mh") pod "b130adda-ecdd-4197-91c8-39c7817474a8" (UID: "b130adda-ecdd-4197-91c8-39c7817474a8"). InnerVolumeSpecName "kube-api-access-g49mh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.037517 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g49mh\" (UniqueName: \"kubernetes.io/projected/b130adda-ecdd-4197-91c8-39c7817474a8-kube-api-access-g49mh\") on node \"crc\" DevicePath \"\""
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.164372 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b130adda-ecdd-4197-91c8-39c7817474a8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b130adda-ecdd-4197-91c8-39c7817474a8" (UID: "b130adda-ecdd-4197-91c8-39c7817474a8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.223716 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" path="/var/lib/kubelet/pods/b130adda-ecdd-4197-91c8-39c7817474a8/volumes"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.241083 4921 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b130adda-ecdd-4197-91c8-39c7817474a8-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.406281 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7752b_must-gather-w5rnd_b130adda-ecdd-4197-91c8-39c7817474a8/copy/0.log"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.406594 4921 generic.go:334] "Generic (PLEG): container finished" podID="b130adda-ecdd-4197-91c8-39c7817474a8" containerID="aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29" exitCode=143
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.406646 4921 scope.go:117] "RemoveContainer" containerID="aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.406790 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7752b/must-gather-w5rnd"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.430486 4921 scope.go:117] "RemoveContainer" containerID="43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.480427 4921 scope.go:117] "RemoveContainer" containerID="aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29"
Mar 18 14:43:55 crc kubenswrapper[4921]: E0318 14:43:55.480963 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29\": container with ID starting with aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29 not found: ID does not exist" containerID="aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.481065 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29"} err="failed to get container status \"aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29\": rpc error: code = NotFound desc = could not find container \"aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29\": container with ID starting with aeeeb89e68d3e0484c0838a95119e4e7a54d153fdb20dc133f1543c96795aa29 not found: ID does not exist"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.481173 4921 scope.go:117] "RemoveContainer" containerID="43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176"
Mar 18 14:43:55 crc kubenswrapper[4921]: E0318 14:43:55.482202 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176\": container with ID starting with 43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176 not found: ID does not exist" containerID="43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176"
Mar 18 14:43:55 crc kubenswrapper[4921]: I0318 14:43:55.482291 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176"} err="failed to get container status \"43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176\": rpc error: code = NotFound desc = could not find container \"43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176\": container with ID starting with 43253a1649f6566bbaf0a1012c6c8262e22a19c1cffb9674b5b6a3efa9eaa176 not found: ID does not exist"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.158135 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564084-ln6q8"]
Mar 18 14:44:00 crc kubenswrapper[4921]: E0318 14:44:00.159485 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="gather"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159504 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="gather"
Mar 18 14:44:00 crc kubenswrapper[4921]: E0318 14:44:00.159574 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="copy"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159582 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="copy"
Mar 18 14:44:00 crc kubenswrapper[4921]: E0318 14:44:00.159594 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="registry-server"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159600 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="registry-server"
Mar 18 14:44:00 crc kubenswrapper[4921]: E0318 14:44:00.159613 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="extract-content"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159618 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="extract-content"
Mar 18 14:44:00 crc kubenswrapper[4921]: E0318 14:44:00.159646 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="extract-utilities"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159654 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="extract-utilities"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159942 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="568e33a6-3bb8-4906-8a8c-43d71e0e51b6" containerName="registry-server"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.159993 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="gather"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.160012 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b130adda-ecdd-4197-91c8-39c7817474a8" containerName="copy"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.161083 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.165699 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.165917 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.166081 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.178587 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564084-ln6q8"]
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.259577 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shb2\" (UniqueName: \"kubernetes.io/projected/71bab288-e8fc-45e4-9b42-819345e102a0-kube-api-access-8shb2\") pod \"auto-csr-approver-29564084-ln6q8\" (UID: \"71bab288-e8fc-45e4-9b42-819345e102a0\") " pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.362987 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shb2\" (UniqueName: \"kubernetes.io/projected/71bab288-e8fc-45e4-9b42-819345e102a0-kube-api-access-8shb2\") pod \"auto-csr-approver-29564084-ln6q8\" (UID: \"71bab288-e8fc-45e4-9b42-819345e102a0\") " pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.416052 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shb2\" (UniqueName: \"kubernetes.io/projected/71bab288-e8fc-45e4-9b42-819345e102a0-kube-api-access-8shb2\") pod \"auto-csr-approver-29564084-ln6q8\" (UID: \"71bab288-e8fc-45e4-9b42-819345e102a0\") " pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:00 crc kubenswrapper[4921]: I0318 14:44:00.489174 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:01 crc kubenswrapper[4921]: I0318 14:44:01.035382 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564084-ln6q8"]
Mar 18 14:44:01 crc kubenswrapper[4921]: I0318 14:44:01.473589 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-ln6q8" event={"ID":"71bab288-e8fc-45e4-9b42-819345e102a0","Type":"ContainerStarted","Data":"c0dccf31a7c532b2f6dfc6a7c4125147cb1fd70d70463786b62855bb4e666405"}
Mar 18 14:44:02 crc kubenswrapper[4921]: I0318 14:44:02.484208 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-ln6q8" event={"ID":"71bab288-e8fc-45e4-9b42-819345e102a0","Type":"ContainerStarted","Data":"17951d5d2a079083dd02f45f0a2fcef10a2a2f8495b2a1992e97e544d0b188b6"}
Mar 18 14:44:02 crc kubenswrapper[4921]: I0318 14:44:02.507775 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564084-ln6q8" podStartSLOduration=1.530809281 podStartE2EDuration="2.507752648s" podCreationTimestamp="2026-03-18 14:44:00 +0000 UTC" firstStartedPulling="2026-03-18 14:44:01.053966888 +0000 UTC m=+9260.603887527" lastFinishedPulling="2026-03-18 14:44:02.030910255 +0000 UTC m=+9261.580830894" observedRunningTime="2026-03-18 14:44:02.500194732 +0000 UTC m=+9262.050115401" watchObservedRunningTime="2026-03-18 14:44:02.507752648 +0000 UTC m=+9262.057673287"
Mar 18 14:44:03 crc kubenswrapper[4921]: I0318 14:44:03.209653 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:44:03 crc kubenswrapper[4921]: E0318 14:44:03.210685 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:44:03 crc kubenswrapper[4921]: I0318 14:44:03.508713 4921 generic.go:334] "Generic (PLEG): container finished" podID="71bab288-e8fc-45e4-9b42-819345e102a0" containerID="17951d5d2a079083dd02f45f0a2fcef10a2a2f8495b2a1992e97e544d0b188b6" exitCode=0
Mar 18 14:44:03 crc kubenswrapper[4921]: I0318 14:44:03.509045 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-ln6q8" event={"ID":"71bab288-e8fc-45e4-9b42-819345e102a0","Type":"ContainerDied","Data":"17951d5d2a079083dd02f45f0a2fcef10a2a2f8495b2a1992e97e544d0b188b6"}
Mar 18 14:44:04 crc kubenswrapper[4921]: I0318 14:44:04.892824 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:04 crc kubenswrapper[4921]: I0318 14:44:04.970962 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shb2\" (UniqueName: \"kubernetes.io/projected/71bab288-e8fc-45e4-9b42-819345e102a0-kube-api-access-8shb2\") pod \"71bab288-e8fc-45e4-9b42-819345e102a0\" (UID: \"71bab288-e8fc-45e4-9b42-819345e102a0\") "
Mar 18 14:44:04 crc kubenswrapper[4921]: I0318 14:44:04.979613 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bab288-e8fc-45e4-9b42-819345e102a0-kube-api-access-8shb2" (OuterVolumeSpecName: "kube-api-access-8shb2") pod "71bab288-e8fc-45e4-9b42-819345e102a0" (UID: "71bab288-e8fc-45e4-9b42-819345e102a0"). InnerVolumeSpecName "kube-api-access-8shb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:44:05 crc kubenswrapper[4921]: I0318 14:44:05.074017 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shb2\" (UniqueName: \"kubernetes.io/projected/71bab288-e8fc-45e4-9b42-819345e102a0-kube-api-access-8shb2\") on node \"crc\" DevicePath \"\""
Mar 18 14:44:05 crc kubenswrapper[4921]: I0318 14:44:05.530896 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564084-ln6q8" event={"ID":"71bab288-e8fc-45e4-9b42-819345e102a0","Type":"ContainerDied","Data":"c0dccf31a7c532b2f6dfc6a7c4125147cb1fd70d70463786b62855bb4e666405"}
Mar 18 14:44:05 crc kubenswrapper[4921]: I0318 14:44:05.530963 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0dccf31a7c532b2f6dfc6a7c4125147cb1fd70d70463786b62855bb4e666405"
Mar 18 14:44:05 crc kubenswrapper[4921]: I0318 14:44:05.531048 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564084-ln6q8"
Mar 18 14:44:05 crc kubenswrapper[4921]: I0318 14:44:05.579901 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-kqmdw"]
Mar 18 14:44:05 crc kubenswrapper[4921]: I0318 14:44:05.593362 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564078-kqmdw"]
Mar 18 14:44:07 crc kubenswrapper[4921]: I0318 14:44:07.222417 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a400f81-0112-4f11-9718-5720e3e6e2ef" path="/var/lib/kubelet/pods/8a400f81-0112-4f11-9718-5720e3e6e2ef/volumes"
Mar 18 14:44:15 crc kubenswrapper[4921]: I0318 14:44:15.210554 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:44:15 crc kubenswrapper[4921]: E0318 14:44:15.211426 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:44:18 crc kubenswrapper[4921]: I0318 14:44:18.695749 4921 scope.go:117] "RemoveContainer" containerID="e9deb72624c0f7c62095c3b3cd36e3293c7e0bc2439e240c3f2e8356de121c8a"
Mar 18 14:44:30 crc kubenswrapper[4921]: I0318 14:44:30.209222 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:44:30 crc kubenswrapper[4921]: E0318 14:44:30.210244 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:44:42 crc kubenswrapper[4921]: I0318 14:44:42.209381 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:44:42 crc kubenswrapper[4921]: E0318 14:44:42.210177 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:44:55 crc kubenswrapper[4921]: I0318 14:44:55.209671 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:44:55 crc kubenswrapper[4921]: E0318 14:44:55.217847 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.179169 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"]
Mar 18 14:45:00 crc kubenswrapper[4921]: E0318 14:45:00.180842 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bab288-e8fc-45e4-9b42-819345e102a0" containerName="oc"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.180868 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bab288-e8fc-45e4-9b42-819345e102a0" containerName="oc"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.181323 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bab288-e8fc-45e4-9b42-819345e102a0" containerName="oc"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.182845 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.189636 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.189954 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.229783 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"]
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.321209 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56jw\" (UniqueName: \"kubernetes.io/projected/f7358d81-81ee-4caf-a86f-6f4be58061b5-kube-api-access-b56jw\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.321544 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7358d81-81ee-4caf-a86f-6f4be58061b5-secret-volume\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.321888 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7358d81-81ee-4caf-a86f-6f4be58061b5-config-volume\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.425480 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b56jw\" (UniqueName: \"kubernetes.io/projected/f7358d81-81ee-4caf-a86f-6f4be58061b5-kube-api-access-b56jw\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.425588 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7358d81-81ee-4caf-a86f-6f4be58061b5-secret-volume\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.425796 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7358d81-81ee-4caf-a86f-6f4be58061b5-config-volume\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.427062 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7358d81-81ee-4caf-a86f-6f4be58061b5-config-volume\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.670305 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7358d81-81ee-4caf-a86f-6f4be58061b5-secret-volume\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.671975 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56jw\" (UniqueName: \"kubernetes.io/projected/f7358d81-81ee-4caf-a86f-6f4be58061b5-kube-api-access-b56jw\") pod \"collect-profiles-29564085-fb5sv\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:00 crc kubenswrapper[4921]: I0318 14:45:00.828288 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:01 crc kubenswrapper[4921]: I0318 14:45:01.365681 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"]
Mar 18 14:45:01 crc kubenswrapper[4921]: I0318 14:45:01.426710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv" event={"ID":"f7358d81-81ee-4caf-a86f-6f4be58061b5","Type":"ContainerStarted","Data":"7a40c199a97cfb8787f7b2d4ba03373ab7c20781b3dfe395106f889ae450b946"}
Mar 18 14:45:02 crc kubenswrapper[4921]: I0318 14:45:02.441006 4921 generic.go:334] "Generic (PLEG): container finished" podID="f7358d81-81ee-4caf-a86f-6f4be58061b5" containerID="febd9e331b662255f0a76973c8dd84b026aee36ccd5742f1eab67f020f13c61d" exitCode=0
Mar 18 14:45:02 crc kubenswrapper[4921]: I0318 14:45:02.441095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv" event={"ID":"f7358d81-81ee-4caf-a86f-6f4be58061b5","Type":"ContainerDied","Data":"febd9e331b662255f0a76973c8dd84b026aee36ccd5742f1eab67f020f13c61d"}
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.826510 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.917814 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7358d81-81ee-4caf-a86f-6f4be58061b5-secret-volume\") pod \"f7358d81-81ee-4caf-a86f-6f4be58061b5\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") "
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.918853 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7358d81-81ee-4caf-a86f-6f4be58061b5-config-volume\") pod \"f7358d81-81ee-4caf-a86f-6f4be58061b5\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") "
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.919004 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b56jw\" (UniqueName: \"kubernetes.io/projected/f7358d81-81ee-4caf-a86f-6f4be58061b5-kube-api-access-b56jw\") pod \"f7358d81-81ee-4caf-a86f-6f4be58061b5\" (UID: \"f7358d81-81ee-4caf-a86f-6f4be58061b5\") "
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.927662 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7358d81-81ee-4caf-a86f-6f4be58061b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7358d81-81ee-4caf-a86f-6f4be58061b5" (UID: "f7358d81-81ee-4caf-a86f-6f4be58061b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.933282 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7358d81-81ee-4caf-a86f-6f4be58061b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7358d81-81ee-4caf-a86f-6f4be58061b5" (UID: "f7358d81-81ee-4caf-a86f-6f4be58061b5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 14:45:03 crc kubenswrapper[4921]: I0318 14:45:03.933340 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7358d81-81ee-4caf-a86f-6f4be58061b5-kube-api-access-b56jw" (OuterVolumeSpecName: "kube-api-access-b56jw") pod "f7358d81-81ee-4caf-a86f-6f4be58061b5" (UID: "f7358d81-81ee-4caf-a86f-6f4be58061b5"). InnerVolumeSpecName "kube-api-access-b56jw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.025721 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b56jw\" (UniqueName: \"kubernetes.io/projected/f7358d81-81ee-4caf-a86f-6f4be58061b5-kube-api-access-b56jw\") on node \"crc\" DevicePath \"\""
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.025969 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7358d81-81ee-4caf-a86f-6f4be58061b5-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.025980 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7358d81-81ee-4caf-a86f-6f4be58061b5-config-volume\") on node \"crc\" DevicePath \"\""
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.470782 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv" event={"ID":"f7358d81-81ee-4caf-a86f-6f4be58061b5","Type":"ContainerDied","Data":"7a40c199a97cfb8787f7b2d4ba03373ab7c20781b3dfe395106f889ae450b946"}
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.470833 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a40c199a97cfb8787f7b2d4ba03373ab7c20781b3dfe395106f889ae450b946"
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.470903 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564085-fb5sv"
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.906750 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6"]
Mar 18 14:45:04 crc kubenswrapper[4921]: I0318 14:45:04.922769 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564040-lmrk6"]
Mar 18 14:45:06 crc kubenswrapper[4921]: I0318 14:45:06.012483 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f65c2b9-1d0d-4f73-a228-4585b515c979" path="/var/lib/kubelet/pods/0f65c2b9-1d0d-4f73-a228-4585b515c979/volumes"
Mar 18 14:45:10 crc kubenswrapper[4921]: I0318 14:45:10.209341 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:45:10 crc kubenswrapper[4921]: E0318 14:45:10.210074 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:45:19 crc kubenswrapper[4921]: I0318 14:45:19.324967 4921 scope.go:117] "RemoveContainer" containerID="2192aca7b19fda3fe1e04781bfee868a9befee5b4b7b1d20c5b4c37090d5874e"
Mar 18 14:45:21 crc kubenswrapper[4921]: I0318 14:45:21.216382 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:45:21 crc kubenswrapper[4921]: E0318 14:45:21.217351 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:45:35 crc kubenswrapper[4921]: I0318 14:45:35.210012 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:45:35 crc kubenswrapper[4921]: E0318 14:45:35.210791 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:45:49 crc kubenswrapper[4921]: I0318 14:45:49.209500 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:45:49 crc kubenswrapper[4921]: E0318 14:45:49.210386 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.165624 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564086-nv5k9"]
Mar 18 14:46:00 crc kubenswrapper[4921]: E0318 14:46:00.166828 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7358d81-81ee-4caf-a86f-6f4be58061b5" containerName="collect-profiles"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.166848 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7358d81-81ee-4caf-a86f-6f4be58061b5" containerName="collect-profiles"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.167448 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7358d81-81ee-4caf-a86f-6f4be58061b5" containerName="collect-profiles"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.169692 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-nv5k9"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.171873 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cbc49"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.172150 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.179269 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.201461 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564086-nv5k9"]
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.269193 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhl6\" (UniqueName: \"kubernetes.io/projected/0a9ba592-d84c-4882-8500-5720bbdfac91-kube-api-access-jzhl6\") pod \"auto-csr-approver-29564086-nv5k9\" (UID: \"0a9ba592-d84c-4882-8500-5720bbdfac91\") " pod="openshift-infra/auto-csr-approver-29564086-nv5k9"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.371869 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhl6\" (UniqueName: \"kubernetes.io/projected/0a9ba592-d84c-4882-8500-5720bbdfac91-kube-api-access-jzhl6\") pod \"auto-csr-approver-29564086-nv5k9\" (UID: \"0a9ba592-d84c-4882-8500-5720bbdfac91\") " pod="openshift-infra/auto-csr-approver-29564086-nv5k9"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.780532 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhl6\" (UniqueName: \"kubernetes.io/projected/0a9ba592-d84c-4882-8500-5720bbdfac91-kube-api-access-jzhl6\") pod \"auto-csr-approver-29564086-nv5k9\" (UID: \"0a9ba592-d84c-4882-8500-5720bbdfac91\") " pod="openshift-infra/auto-csr-approver-29564086-nv5k9"
Mar 18 14:46:00 crc kubenswrapper[4921]: I0318 14:46:00.814203 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-nv5k9"
Mar 18 14:46:01 crc kubenswrapper[4921]: I0318 14:46:01.358369 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564086-nv5k9"]
Mar 18 14:46:01 crc kubenswrapper[4921]: I0318 14:46:01.386675 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 14:46:02 crc kubenswrapper[4921]: I0318 14:46:02.145748 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564086-nv5k9" event={"ID":"0a9ba592-d84c-4882-8500-5720bbdfac91","Type":"ContainerStarted","Data":"05eddc6ad2ad8d1425f0f6149885016038cec452c45b2905217eaeffa7f49aee"}
Mar 18 14:46:02 crc kubenswrapper[4921]: I0318 14:46:02.209490 4921 scope.go:117] "RemoveContainer" containerID="313eef41167fbb3f9896fe5828b6d6496b01b61ebac9556a535152ffe849e5c4"
Mar 18 14:46:02 crc kubenswrapper[4921]: E0318 14:46:02.209742 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fsfj7_openshift-machine-config-operator(509553d8-b894-456c-a45e-665e8497cdbc)\"" pod="openshift-machine-config-operator/machine-config-daemon-fsfj7" podUID="509553d8-b894-456c-a45e-665e8497cdbc"
Mar 18 14:46:04 crc kubenswrapper[4921]: I0318 14:46:04.173650 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a9ba592-d84c-4882-8500-5720bbdfac91" containerID="7a5faa68f58e40ba79d3e45591f7e6190a475423dfc80a40803a5b5d3c7991fd" exitCode=0
Mar 18 14:46:04 crc kubenswrapper[4921]: I0318 14:46:04.173709 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564086-nv5k9" event={"ID":"0a9ba592-d84c-4882-8500-5720bbdfac91","Type":"ContainerDied","Data":"7a5faa68f58e40ba79d3e45591f7e6190a475423dfc80a40803a5b5d3c7991fd"}
Mar 18 14:46:05 crc kubenswrapper[4921]: I0318 14:46:05.783206 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-nv5k9"
Mar 18 14:46:05 crc kubenswrapper[4921]: I0318 14:46:05.900428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzhl6\" (UniqueName: \"kubernetes.io/projected/0a9ba592-d84c-4882-8500-5720bbdfac91-kube-api-access-jzhl6\") pod \"0a9ba592-d84c-4882-8500-5720bbdfac91\" (UID: \"0a9ba592-d84c-4882-8500-5720bbdfac91\") "
Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.073131 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9ba592-d84c-4882-8500-5720bbdfac91-kube-api-access-jzhl6" (OuterVolumeSpecName: "kube-api-access-jzhl6") pod "0a9ba592-d84c-4882-8500-5720bbdfac91" (UID: "0a9ba592-d84c-4882-8500-5720bbdfac91"). InnerVolumeSpecName "kube-api-access-jzhl6".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.104892 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzhl6\" (UniqueName: \"kubernetes.io/projected/0a9ba592-d84c-4882-8500-5720bbdfac91-kube-api-access-jzhl6\") on node \"crc\" DevicePath \"\"" Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.195891 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564086-nv5k9" event={"ID":"0a9ba592-d84c-4882-8500-5720bbdfac91","Type":"ContainerDied","Data":"05eddc6ad2ad8d1425f0f6149885016038cec452c45b2905217eaeffa7f49aee"} Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.195939 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05eddc6ad2ad8d1425f0f6149885016038cec452c45b2905217eaeffa7f49aee" Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.196001 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564086-nv5k9" Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.872523 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-h2n9p"] Mar 18 14:46:06 crc kubenswrapper[4921]: I0318 14:46:06.888389 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564080-h2n9p"] Mar 18 14:46:07 crc kubenswrapper[4921]: I0318 14:46:07.242862 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863fca1b-c47d-4eff-bb21-65125bd4e2f8" path="/var/lib/kubelet/pods/863fca1b-c47d-4eff-bb21-65125bd4e2f8/volumes"